Ingest events with the Datadog Log Ingestion API

Introduction

Use the Datadog Log Ingestion application programming interface (API) to send JSON log events from cloud apps, microservices, or any system emitting telemetry directly into Datadog Log Management. Once you ingest logs, they're available in Log Explorer within seconds. You can process, route, and archive them using Log Pipelines.

This endpoint supports high-throughput, real-time ingestion of structured and semi-structured JSON payloads.

Actions supported by this API

  • Ship logs directly from cloud apps without installing the Datadog Agent.
  • Apply pipeline processors, such as Grok Parser, Remapper, and Lookup Processor, to incoming data.
  • Route logs to indexes, archives, or alert destinations based on content.
  • Reduce indexing cost by combining this endpoint with sampling and filtering rules.
note

Learn more in the Datadog API and API keys documentation.

Before you start

Make sure you have:

  • A Datadog API key with log ingestion permissions. Find this in Organization Settings > API Keys.

  • Your Datadog site address. This address varies by region:

    Region            Site address
    US (east)         datadoghq.com
    US3 (west)        us3.datadoghq.com
    US5 (central)     us5.datadoghq.com
    EU (Europe)       datadoghq.eu
    AP1 (Japan)       ap1.datadoghq.com
    AP2 (Australia)   ap2.datadoghq.com
  • curl or Postman installed.

  • JSON-formatted log payloads.

Pipeline overview

[Diagram: Datadog log ingestion pipeline]

Ingest endpoint

POST https://http-intake.logs.{dd_site}/api/v2/logs

Replace {dd_site} with your region's site address, for example, datadoghq.com or ap2.datadoghq.com.
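As a quick sketch, the endpoint URL for any region can be assembled from the site address listed in the table above (the `intake_url` helper name is illustrative, not part of the Datadog API):

```python
def intake_url(dd_site: str) -> str:
    """Build the v2 log-intake endpoint for a given Datadog site address."""
    return f"https://http-intake.logs.{dd_site}/api/v2/logs"

print(intake_url("datadoghq.eu"))
# https://http-intake.logs.datadoghq.eu/api/v2/logs
```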

Request headers

Header         Value              Required   Description
DD-API-KEY     <your_api_key>     Yes        Your Datadog API key
Content-Type   application/json   Yes        Payload format

Request body

The request body is a JSON array of one or more log objects. Each log object supports the following fields:

Field      Type     Required      Description
message    string   Yes           The log message body
ddsource   string   Recommended   The technology the log originates from, for example, python or nginx
ddtags     string   Optional      Comma-separated tags, for example, env:prod,team:payments
hostname   string   Optional      The name of the host that generated the log
service    string   Recommended   The name of the app or service
note

You must include the message field. All other fields are optional, but Datadog recommends them. Datadog uses service, ddsource, and ddtags for filtering, faceting, and pipeline matching.
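The field rules above can be sketched as a small payload builder. `make_log` is a hypothetical helper, not part of any Datadog library; it enforces the required message field and only includes the recommended fields when you provide them:

```python
def make_log(message, *, ddsource=None, ddtags=None, hostname=None,
             service=None, **extra):
    """Build one log object for the JSON array payload.

    Raises ValueError if the required `message` field is missing;
    any extra keyword arguments become custom attributes.
    """
    if not message:
        raise ValueError("message is required")
    log = {"message": message, **extra}
    for key, value in (("ddsource", ddsource), ("ddtags", ddtags),
                       ("hostname", hostname), ("service", service)):
        if value is not None:
            log[key] = value
    return log
```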

Example payload

[
  {
    "message": "Transaction failed: Gateway timeout",
    "ddsource": "payment-gateway",
    "ddtags": "env:prod,region:us-east-1",
    "hostname": "payments-host-01",
    "service": "payment-gateway",
    "timestamp": "2025-11-15T08:30:00Z",
    "transaction_id": "txn_998877",
    "customer_id": "cus_554433",
    "level": "ERROR"
  }
]
note

The timestamp must use International Organization for Standardization (ISO) 8601 Coordinated Universal Time (UTC) format. Datadog uses this format for timeline alignment in Log Explorer. Datadog indexes logs submitted without a timestamp using the ingest time.
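One way to produce a compliant timestamp is with the Python standard library; a minimal sketch (the `utc_timestamp` name is illustrative):

```python
from datetime import datetime, timezone

def utc_timestamp() -> str:
    """Return the current time as an ISO 8601 UTC string,
    for example "2025-11-15T08:30:00Z"."""
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
```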

Response codes

202 accepted

A 202 Accepted response means Datadog received the payload. This response has no body. The log appears in Log Explorer within a few seconds.

HTTP/1.1 202 Accepted

400 bad request

Datadog returns this code when a payload has formatting errors. Common causes:

  • Incorrect JSON syntax.
  • The message field is missing.
  • The payload exceeds the 5 MB limit.
{
  "errors": ["Invalid JSON"]
}

401 unauthorized

The API key is missing, incorrect, or lacks log ingestion permissions.

{
  "errors": ["Forbidden"]
}

429 too many requests

Your client exceeded the request rate limit. Reduce request frequency or batch log objects into a single array payload.
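A common mitigation is to retry with exponential backoff when a 429 arrives. This is a generic sketch, not an official Datadog client feature; `send` stands in for whatever function performs your HTTP call and returns its status code:

```python
import time

def send_with_backoff(send, payload, max_retries=5, base_delay=1.0):
    """Call `send(payload)` and retry on HTTP 429, doubling the delay
    each attempt (base_delay, 2*base_delay, 4*base_delay, ...)."""
    status = None
    for attempt in range(max_retries):
        status = send(payload)
        if status != 429:
            return status
        time.sleep(base_delay * 2 ** attempt)
    return status
```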

Curl example

export DD_API_KEY="your_datadog_api_key_here"
export DD_SITE="datadoghq.com"

curl -X POST "https://http-intake.logs.$DD_SITE/api/v2/logs" \
  -H "DD-API-KEY: $DD_API_KEY" \
  -H "Content-Type: application/json" \
  -d '[
    {
      "message": "Transaction failed: Gateway timeout",
      "ddsource": "payment-gateway",
      "ddtags": "env:prod,region:us-east-1",
      "hostname": "payments-host-01",
      "service": "payment-gateway",
      "level": "ERROR",
      "transaction_id": "txn_998877"
    }
  ]'
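The same request can be assembled in Python with only the standard library. This sketch builds the request object without sending it; `build_request` is an illustrative helper, and actually sending requires `urllib.request.urlopen(req)` with a valid API key (expect HTTP 202 on success):

```python
import json
import urllib.request

def build_request(logs, api_key, dd_site="datadoghq.com"):
    """Assemble a POST request for the v2 log-intake endpoint
    with the required DD-API-KEY and Content-Type headers."""
    body = json.dumps(logs).encode("utf-8")
    return urllib.request.Request(
        f"https://http-intake.logs.{dd_site}/api/v2/logs",
        data=body,
        headers={"DD-API-KEY": api_key, "Content-Type": "application/json"},
        method="POST",
    )
```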

Batching multiple log events

You can send up to 1,000 log entries in a single request by passing an array. Datadog recommends this approach for high-throughput services.

[
  {
    "message": "User login succeeded",
    "service": "auth-service",
    "ddsource": "python",
    "ddtags": "env:prod",
    "level": "INFO"
  },
  {
    "message": "Transaction failed: Gateway timeout",
    "service": "payment-gateway",
    "ddsource": "python",
    "ddtags": "env:prod",
    "level": "ERROR",
    "transaction_id": "txn_998877"
  }
]

Limits:

  • Max payload size: 5 MB per request
  • Max individual log size: 1 MB
  • Max array entries: 1,000 log objects
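The limits above suggest batching client-side before sending. Here is a sketch that splits a log stream into compliant batches; `chunk_logs` is an illustrative helper, and its byte estimate adds one byte per entry to approximate JSON separators:

```python
import json

def chunk_logs(logs, max_entries=1000, max_bytes=5_000_000):
    """Split an iterable of log objects into batches that stay within
    the 1,000-entry and 5 MB per-request limits."""
    batches, current, size = [], [], 2  # 2 bytes for the enclosing "[]"
    for log in logs:
        entry = len(json.dumps(log).encode("utf-8")) + 1  # +1 for the comma
        if current and (len(current) >= max_entries or size + entry > max_bytes):
            batches.append(current)
            current, size = [], 2
        current.append(log)
        size += entry
    if current:
        batches.append(current)
    return batches
```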

Common use cases

  • Ingesting logs from microservices without deploying the Datadog Agent.
  • Sending structured JSON events from serverless functions such as Amazon Web Services (AWS) Lambda or Google Cloud Platform (GCP) Cloud Run.
  • Streaming telemetry from IoT devices or edge services.
  • Routing enriched events to security information and event management (SIEM), S3, or alerting destinations via Log Pipelines.
  • Submitting logs from continuous integration and continuous delivery (CI/CD) pipelines or deployment scripts.

Troubleshooting

Issue                    Probable cause                   Solution
401 Unauthorized         Incorrect or missing API key     Verify DD_API_KEY in Organization Settings > API Keys
400 Bad Request          Malformed JSON                   Validate with jq . payload.json before sending
Log not in Explorer      Pipeline filter excluding log    Clear filters; check index routing rules
Timestamp out of order   Non-UTC or non-ISO 8601 format   Use "2025-11-15T08:30:00Z" format
429 Too Many Requests    Rate limit exceeded              Batch log objects into a single array payload

Next steps