Routing Cloud App Logs Into a Pipeline
This guide walks you through the "Hello World" of observability pipelines: sending a single JSON log event from a cloud app (simulated via API) and verifying it in the platform.
Goal: Ingest a mock app log, verify it reached the pipeline, and prepare it for routing to a destination (like Splunk, Elastic, or S3).
Why this matters
Before you build complex parsing rules or route data to expensive SIEMs, you need to trust your ingestion path.
This workflow confirms:
- Your API token works.
- The pipeline accepts your JSON structure.
- You can see the data in real time.
Before you start
Ensure you have:
- An API Token (with `ingest` permissions).
- Your unique Stream ID (from the platform UI).
- Terminal access (to run `curl`) or Postman.
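If you want to sanity-check these prerequisites from the shell first, here is a minimal check (it assumes the `API_TOKEN` and `STREAM_ID` environment variables you will export in step 3):

```bash
# Quick prerequisite check; API_TOKEN and STREAM_ID are exported in step 3
command -v curl >/dev/null || echo "curl is not installed"
[ -n "$API_TOKEN" ] || echo "API_TOKEN is not set"
[ -n "$STREAM_ID" ] || echo "STREAM_ID is not set"
```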
Pipeline overview
```mermaid
flowchart LR
    A[Cloud App] -->|JSON Log| B[Ingest API]
    B --> C{Routing Engine}
    C -->|Match Rule| D[Live Data Viewer]
    C -->|No Match| E[Drop / Default Bucket]
```
Or, as ASCII:

```
[Your App]
     |
     |  POST /v1/streams/ingest
     |
     v
+-----------------------+
|    Ingestion Node     |
+-----------------------+
     |
     v
+-----------------------+
|   Live Data Viewer    |  <-- We are verifying here
+-----------------------+
```
1. Create or select a stream
In your observability platform:
- Go to Streams > Management.
- Click New Stream (or select `default-logs`).
- Copy the Stream ID (e.g., `st_12345`).
- (Optional) Set a retention policy (the default is usually 7 days).
2. Create a sample log event
We will simulate a cloud app log. Save this as `app-log.json`.
```json
{
  "timestamp": "2025-11-15T08:30:00Z",
  "service": "payment-gateway",
  "level": "ERROR",
  "message": "Transaction failed: Gateway timeout",
  "transaction_id": "txn_998877",
  "meta": {
    "region": "us-east-1",
    "customer_id": "cus_554433"
  }
}
```
Note: Keep the `timestamp` in ISO 8601 format to ensure proper indexing.
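Before sending, it is worth confirming the file parses as valid JSON; `jq` (or `python -m json.tool`) does this in one line:

```bash
# Fails loudly if app-log.json contains a syntax error
jq . app-log.json
```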
3. Send the log event
Use `curl` to simulate the app sending the log.
```bash
export STREAM_ID="your_stream_id_here"
export API_TOKEN="your_api_token_here"

curl -X POST "https://api.observability-platform.com/v1/streams/ingest" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "X-Stream-Id: $STREAM_ID" \
  -d @app-log.json
```
Expected response
```json
{
  "status": "accepted",
  "ingest_id": "evt_abc12345",
  "accepted_items": 1
}
```
If you see `"status": "accepted"`, the pipeline received the data.
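If the response body comes back empty or malformed, checking the raw HTTP status code narrows things down quickly:

```bash
# Print only the HTTP status code; a 2xx means the event was accepted
curl -s -o /dev/null -w "HTTP %{http_code}\n" \
  -X POST "https://api.observability-platform.com/v1/streams/ingest" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "X-Stream-Id: $STREAM_ID" \
  -d @app-log.json
```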
4. Verify your log in the live data viewer
- Navigate to Explore or Live Tail in the platform UI.
- Select your stream (`st_12345`).
- Add a filter: `service == "payment-gateway"`.
- You should see your event appear instantly:

```
[ERROR] 2025-11-15T08:30:00Z service=payment-gateway msg="Transaction failed..."
```
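A single event is easy to miss in a busy stream. As a sketch, a small loop that stamps each copy with a unique `transaction_id` (via `jq`) makes your test events unmistakable in Live Tail:

```bash
# Send three copies of the sample event, each with a unique transaction_id
for i in 1 2 3; do
  jq --arg txn "txn_$(date +%s)_$i" '.transaction_id = $txn' app-log.json |
    curl -s -X POST "https://api.observability-platform.com/v1/streams/ingest" \
      -H "Authorization: Bearer $API_TOKEN" \
      -H "Content-Type: application/json" \
      -H "X-Stream-Id: $STREAM_ID" \
      -d @-
  echo
  sleep 1
done
```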
5. Route logs to a destination
Now that data is flowing, create a routing rule:
- Go to Pipelines > Routing Rules.
- Click New Rule.
- Filter: `level == "ERROR"`
- Action: Route to `S3-Archive` AND `Slack-Alerts`.
- Save & Deploy.
Send the curl request again.
- Check your S3 bucket.
- Check your Slack channel.
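To confirm the rule matches only errors, send a modified copy with `level` set to `INFO`; it should still be ingested but should skip both destinations:

```bash
# Negative test: an INFO event should be ingested but NOT routed to
# S3-Archive or Slack-Alerts by the level == "ERROR" rule
jq '.level = "INFO" | .message = "Transaction completed"' app-log.json |
  curl -s -X POST "https://api.observability-platform.com/v1/streams/ingest" \
    -H "Authorization: Bearer $API_TOKEN" \
    -H "Content-Type: application/json" \
    -H "X-Stream-Id: $STREAM_ID" \
    -d @-
```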
6. Optimize the pipeline
Now that you have a working flow, consider adding:
- Parsing: Extract `customer_id` into a top-level field.
- Enrichment: Add a `team_owner` tag based on the `service` name.
- Masking: Obfuscate `customer_id` if it is PII (see the sketch below).
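As a local preview of what a masking transform might produce (the real transform runs inside the pipeline, not on your machine), `jq` can mimic it:

```bash
# Keep the first 4 characters of customer_id and mask the rest
# ("cus_554433" becomes "cus_****")
jq '.meta.customer_id |= (.[0:4] + "****")' app-log.json
```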
Troubleshooting
- 401 Unauthorized: Check your API Token permissions.
- 400 Bad Request: Validate your JSON syntax.
- 404 Not Found: Verify the `X-Stream-Id` is correct.
- Timestamp issues: Ensure you are using UTC ISO 8601 format.
- Data not showing: Check whether an active filter is excluding your new log.
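For 401 and 404 responses in particular, curl's verbose mode shows exactly which headers were sent and what came back:

```bash
# -v prints request (>) and response (<) headers to stderr
curl -v -X POST "https://api.observability-platform.com/v1/streams/ingest" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "X-Stream-Id: $STREAM_ID" \
  -d @app-log.json 2>&1 | grep -E '^[<>]'
```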
Next steps
- Ingest API Reference
- Observability Concepts
- Try sending different log types (VM logs, container logs, API events).