Before you start - parallel observability for cloud and AI

This page covers the prerequisites, tools, and conceptual model you'll need before working through the pipeline and observability documentation. Whether you're setting up log ingestion in Datadog, tracing large language model (LLM) calls in Galileo, or both, start here.

What this section covers

This documentation covers two complementary observability workflows:

  1. Operational log ingestion and routing with Datadog

    • Ingest JSON logs from cloud apps via the Datadog Log Ingestion application programming interface (API), apply pipeline processors, and route data to downstream destinations.
  2. LLM tracing and evaluation with Galileo

    • Instrument AI-powered services with the Galileo software development kit (SDK) to capture traces, spans, and evaluation metrics for every LLM call.

Follow the docs in the recommended order:

Start with concepts

Event Streams & Observability Pipelines

Then learn the ingestion API

Ingest events using the Datadog Log Ingestion API

Then follow the hands-on guide

Routing cloud app logs with Datadog and Galileo

Tools you'll need

For Datadog log ingestion

  • cURL or Postman—Use either tool to send requests to the Datadog Log Ingestion API.
  • A terminal or shell—macOS, Linux, or Windows PowerShell all work.
  • jq (optional but recommended)—Use it to validate JSON payloads before sending them.
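As a quick sanity check, jq exits non-zero on malformed input, so it works as a pre-send gate. The payload below is a hypothetical example log; its field names are illustrative, not a required schema.

```shell
# A sample log event (hypothetical fields -- Datadog accepts arbitrary JSON attributes)
payload='{"ddsource":"my-app","service":"checkout","message":"order placed","status":"info"}'

# jq -e exits non-zero if the input is not valid JSON
if echo "$payload" | jq -e . > /dev/null; then
  echo "payload is valid JSON"
else
  echo "payload is NOT valid JSON" >&2
fi
```

Running this before every send catches stray quotes or trailing commas locally, instead of discovering them as rejected events in Datadog.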

For Galileo LLM tracing

  • Python 3.8+ or Node.js 18+—The Galileo SDK supports both environments.

  • pip or npm—Use these tools to install the SDK.

    pip install galileo
  • A code editor, preferably VS Code (Cursor, Antigravity, or Codex are acceptable alternatives).

Credentials needed

Datadog

Credential | Where to find it
API key | Datadog → Organization Settings → API Keys
Datadog site address | Depends on your region, for example, datadoghq.com (US) or datadoghq.eu (EU). See Datadog sites.
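Once you have both values, the logs intake endpoint can be derived from your site. The URL pattern below is a sketch of Datadog's v2 logs intake; verify it against the Datadog docs for your region before relying on it.

```shell
# Substitute your real values; these are placeholders
export DD_SITE="datadoghq.com"      # or datadoghq.eu for EU accounts
export DD_API_KEY="<your-api-key>"

# Derive the v2 logs intake URL from the site
DD_LOGS_URL="https://http-intake.logs.${DD_SITE}/api/v2/logs"
echo "$DD_LOGS_URL"
```

Keeping the site in an environment variable means the same scripts work unchanged across US and EU organizations.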

Galileo

Credential | Where to find it
API key | app.galileo.ai → Settings → API Keys
Project name | Created when you set up a new project in Galileo
Log stream name | Created per environment, such as dev, staging, or production
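A common pattern is to export these three values as environment variables so the SDK can pick them up without hard-coding credentials. The variable names below are an assumption; confirm the exact names the Galileo SDK reads in the Galileo docs.

```shell
# Variable names are assumptions -- check the Galileo docs for the exact names
export GALILEO_API_KEY="<your-api-key>"
export GALILEO_PROJECT="my-llm-app"   # hypothetical project name
export GALILEO_LOG_STREAM="dev"       # match this to your environment
echo "Galileo configured for project ${GALILEO_PROJECT} (${GALILEO_LOG_STREAM})"
```

Using one log stream per environment (dev, staging, production) keeps evaluation metrics from different deployments separate.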

See Where do I find my project keys? in the Galileo docs.

Who this documentation is for

Persona | Primary use
Platform engineers | Build and maintain scalable Datadog log pipelines
Site reliability engineers (SREs) and DevOps | Normalize logs, reduce noise, and set up routing and alerting
AI and machine learning (ML) engineers | Instrument LLM services and score model quality with Galileo
Security engineers | Route audit and authentication logs to security information and event management (SIEM) destinations
Developers | Send app logs and trace LLM calls without deep infrastructure knowledge

What you should already know

These docs assume:

  • Basic familiarity with JSON
  • Comfort working in a command-line shell
  • A general understanding of logs, events, or metrics
  • Awareness of cloud or microservice environments

If you haven't sent a POST request before, the hands-on guide walks through the process step by step.
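Every POST request has the same four pieces: a method, a URL, headers, and a body. The sketch below assembles them without sending anything; the URL and header names are illustrative placeholders, not a real Datadog call.

```shell
# The four pieces of a POST request (placeholder values throughout)
method="POST"
url="https://example.com/api/v2/logs"
body='{"message":"hello from curl","status":"info"}'

# The real call would look like this (commented out -- placeholders only):
#   curl -X "$method" "$url" \
#     -H "Content-Type: application/json" \
#     -H "DD-API-KEY: <your-api-key>" \
#     -d "$body"
echo "would send: $method $url"
```

The hands-on guide fills in the real endpoint and headers for your Datadog site.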

Conceptual model

All workflows in this section follow a two-track observability model:

Observability pipeline architecture

Datadog handles your operational telemetry, including infrastructure logs, error rates, routing rules, and alerting.

Galileo handles your AI telemetry, including LLM inputs and outputs, latency per span, and evaluation scores.

Together they provide full-stack visibility across both layers of a modern cloud app.