Before you start: parallel observability for cloud and AI
This page covers the prerequisites, tools, and conceptual model you'll need before working through the pipeline and observability documentation. Whether you're setting up log ingestion in Datadog, tracing large language model (LLM) calls in Galileo, or both, start here.
What this section covers
This documentation covers two complementary observability workflows:
- **Operational log ingestion and routing with Datadog**: Ingest JSON logs from cloud apps via the Datadog Log Ingestion application programming interface (API), apply pipeline processors, and route data to downstream destinations.
- **LLM tracing and evaluation with Galileo**: Instrument AI-powered services with the Galileo software development kit (SDK) to capture traces, spans, and evaluation metrics for every LLM call.
Follow the docs in the recommended order:

1. **Start with concepts**: Event Streams & Observability Pipelines
2. **Then learn the ingestion API**: Ingest events using the Datadog Log Ingestion API
3. **Then follow the hands-on guide**: Routing cloud app logs with Datadog and Galileo
Tools you'll need
For Datadog log ingestion
- cURL or Postman: Use either tool to send API requests to the Datadog Log Ingestion API.
- A terminal or shell: macOS, Linux, or Windows PowerShell all work.
- `jq` (optional but recommended): Use this tool to validate JSON payloads before sending them.
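If you'd rather validate payloads in code than with `jq`, the same check is a few lines of standard-library Python. This is a minimal sketch; the payload shown is a made-up example, not one from the guides:

```python
import json

def is_valid_json(payload: str) -> bool:
    """Return True if the string parses as JSON, False otherwise."""
    try:
        json.loads(payload)
        return True
    except json.JSONDecodeError:
        return False

# A hypothetical log payload, similar in shape to what you'll send later
payload = '{"message": "user login", "service": "auth", "status": "info"}'
print(is_valid_json(payload))       # True
print(is_valid_json('{"oops": }'))  # False: malformed JSON
```

This mirrors what `jq empty < payload.json` does on the command line: parse and report, without transforming anything.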
For Galileo LLM tracing
- Python 3.8+ or Node.js 18+: The Galileo SDK supports both environments.
- `pip` or `npm`: Use one of these tools to install the SDK.

  ```shell
  # Python
  pip install galileo
  ```

  ```shell
  # TypeScript
  npm install galileo
  ```

- A code editor, preferably VS Code (Cursor, Antigravity, or Codex are acceptable alternatives).
Credentials needed
Datadog
| Credential | Where to find it |
|---|---|
| API key | Datadog → Organization Settings → API Keys |
| Datadog site address | Depends on your region, for example, datadoghq.com (US) or datadoghq.eu (EU). See Datadog sites |
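Your API key and site value combine into the intake endpoint you'll call in the hands-on guide. As a sketch of how the two fit together (the v2 intake path below follows Datadog's public docs, but confirm it for your site; the key value is a placeholder):

```python
def datadog_intake_url(site: str) -> str:
    """Build the v2 log intake URL for a given Datadog site."""
    return f"https://http-intake.logs.{site}/api/v2/logs"

# The DD-API-KEY header carries your API key; this value is a placeholder.
headers = {
    "DD-API-KEY": "<YOUR_DATADOG_API_KEY>",
    "Content-Type": "application/json",
}

print(datadog_intake_url("datadoghq.com"))  # US site
print(datadog_intake_url("datadoghq.eu"))   # EU site
```

Using the wrong site is a common first-run error: requests sent to `datadoghq.com` with an EU-region key are rejected.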
Galileo
| Credential | Where to find it |
|---|---|
| API key | app.galileo.ai → Settings → API Keys |
| Project name | Created when you set up a new project in Galileo |
| Log stream name | Created per environment, such as dev, staging, or production. |
See Where do I find my project keys? in the Galileo docs.
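A common pattern is to export these three values as environment variables and fail fast if any is missing. The variable names below (`GALILEO_API_KEY`, `GALILEO_PROJECT`, `GALILEO_LOG_STREAM`) follow the Galileo docs, but confirm them against your SDK version; the example values are placeholders:

```python
import os

# Names follow the Galileo docs; confirm for your SDK version.
REQUIRED = ("GALILEO_API_KEY", "GALILEO_PROJECT", "GALILEO_LOG_STREAM")

def missing_credentials() -> list[str]:
    """Return the names of any required Galileo variables that are unset."""
    return [name for name in REQUIRED if not os.environ.get(name)]

# Placeholder values for illustration only
os.environ.setdefault("GALILEO_API_KEY", "<YOUR_GALILEO_API_KEY>")
os.environ.setdefault("GALILEO_PROJECT", "my-llm-app")
os.environ.setdefault("GALILEO_LOG_STREAM", "dev")

print(missing_credentials())  # [] once all three are set
```

Keeping credentials in the environment (rather than in code) also makes it easy to point the same service at `dev`, `staging`, or `production` log streams.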
Who this documentation is for
| Persona | Primary use |
|---|---|
| Platform engineers | Build and maintain scalable Datadog log pipelines |
| Site reliability engineers (SREs) and DevOps | Normalize logs, reduce noise, and set up routing and alerting |
| AI and machine learning (ML) engineers | Instrument LLM services and score model quality with Galileo |
| Security engineers | Route audit and authentication logs to security information and event management (SIEM) destinations |
| Developers | Send app logs and trace LLM calls without deep infrastructure knowledge |
What you should already know
These docs assume:
- Basic familiarity with JSON
- Comfort running command-line commands
- A general understanding of logs, events, or metrics
- Awareness of cloud or microservice environments
If you haven't sent a POST request before, the hands-on guide walks through the process step by step.
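For orientation, here is the anatomy of a POST request assembled with Python's standard library. The URL is a placeholder and nothing is sent over the network; the hands-on guide supplies the real endpoint and headers:

```python
import json
import urllib.request

# Placeholder endpoint; the hands-on guide supplies the real intake URL.
url = "https://example.com/api/v2/logs"
body = json.dumps({"message": "hello", "service": "demo"}).encode("utf-8")

req = urllib.request.Request(
    url,
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Calling urllib.request.urlopen(req) would actually send it.
print(req.get_method(), req.full_url)  # POST https://example.com/api/v2/logs
```

Every request in these docs has the same three parts: a URL, headers (content type plus credentials), and a JSON body.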
Conceptual model
All workflows in this section follow a two-track observability model:
```mermaid
flowchart LR
    A["Cloud App"] -->|HTTP POST| B["Datadog<br/>Log Ingestion API"]
    B --> C["Pipeline<br/> • Parse<br/> • Enrich<br/> • Route"]
    C --> D{{"Destinations<br/> • S3<br/> • SIEM<br/> • Alerts"}}
    A -->|Galileo SDK| E["Galileo<br/>Log Stream"]
    E --> F["Galileo AI Agent"]
    F --> G{{"Metrics and<br/>Evaluation"}}
```
Datadog handles your operational telemetry, including infrastructure logs, error rates, routing rules, and alerting.
Galileo handles your AI telemetry, including LLM inputs and outputs, latency per span, and evaluation scores.
Together they provide full-stack visibility across both layers of a modern cloud app.