Log Forwarding allows you to send logs from Datadog to custom destinations like Splunk, Elasticsearch, and HTTP endpoints. This means that you can use Log Pipelines to centrally collect, process, and standardize your logs in Datadog. Then, send the logs from Datadog to other tools to support individual teams’ workflows. You can choose to forward any of the ingested logs, whether or not they are indexed, to custom destinations. Logs are forwarded in JSON format and compressed with GZIP.
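For illustration, a custom HTTP destination only needs to accept gzip-compressed JSON over HTTPS. The following is a minimal receiver sketch, assuming a Flask app, a hypothetical `/datadog-logs` route, and batches delivered as a JSON array; the actual payload fields depend on your pipelines.

```python
import gzip
import json

from flask import Flask, request

app = Flask(__name__)

# Hypothetical route; point the custom destination's endpoint at this path.
@app.route("/datadog-logs", methods=["POST"])
def receive_logs():
    body = request.get_data()
    # Forwarded payloads arrive as gzip-compressed JSON.
    if request.headers.get("Content-Encoding") == "gzip":
        body = gzip.decompress(body)
    # Assumes each batch is a JSON array of log events.
    for log in json.loads(body):
        print(log)  # Hand off to your own tooling here.
    return "", 200

if __name__ == "__main__":
    app.run(port=8443)  # Front with TLS in practice; the endpoint must be https.
```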
Note: Only Datadog users with the `logs_write_forwarding_rules` permission can create, edit, and delete custom destinations for forwarding logs.
If a forwarding attempt fails (for example, if your destination temporarily becomes unavailable), Datadog retries periodically for 2 hours using an exponential backoff strategy. The first retry is made after a 1-minute delay; for subsequent retries, the delay increases progressively to a maximum of 8-12 minutes (10 minutes with 20% variance).
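As an illustration only (not Datadog's implementation), the documented schedule can be modeled like this:

```python
import random

# Illustrative model of the documented retry schedule: exponential backoff
# starting at 1 minute, capped at 10 minutes with 20% variance, for 2 hours.
def retry_delays(window_minutes: int = 120) -> list[float]:
    delays, elapsed, base = [], 0.0, 1.0
    while elapsed < window_minutes:
        delay = min(base, 10.0) * random.uniform(0.8, 1.2)  # 8-12 min at the cap
        delays.append(round(delay, 1))
        elapsed += delay
        base *= 2
    return delays

print(retry_delays())  # e.g. [1.1, 1.9, 4.3, 8.7, 10.6, 9.4, ...]
```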
Metrics are available that report on forwarded logs, covering logs that were sent successfully (including logs that succeeded after retries) as well as logs that were dropped.
For an HTTP endpoint destination, enter the endpoint to which you want to send the logs. The endpoint must start with `https://`. Then configure the authentication type:

Authentication Type | Description | Example |
---|---|---|
Basic Authentication | Provide the username and password for the account to which you want to send logs. | Username: `myaccount`, Password: `mypassword` |
Request Header | Provide the header name and value. For example, to use an `Authorization` header, enter `Authorization` for Header Name and use a header value formatted as `Basic username:password`, encoded in base64. | Header Name: `Authorization`, Header Value: `Basic bXlhY2NvdW50Om15cGFzc3dvcmQ=` |
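For the Request Header option, the base64-encoded value can be produced in a few lines; the credentials below are the documentation's example values:

```python
import base64

# Example credentials from the table above; substitute your own.
username = "myaccount"
password = "mypassword"

token = base64.b64encode(f"{username}:{password}".encode()).decode()
print(f"Basic {token}")  # Basic bXlhY2NvdW50Om15cGFzc3dvcmQ=
```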
For a Splunk destination, enter the endpoint to which you want to send the logs. The endpoint must start with `https://`. For example, enter `https://<your_account>.splunkcloud.com:8088`. Note: `/services/collector/event` is automatically appended to the endpoint.
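To check connectivity before configuring the destination, you can send a test event to the HEC endpoint by hand. A sketch using the Python `requests` library, with placeholder account and token values:

```python
import requests

# Placeholder values; substitute your own Splunk Cloud account and HEC token.
endpoint = "https://<your_account>.splunkcloud.com:8088"
hec_token = "<your_hec_token>"

# Datadog appends /services/collector/event automatically; when testing by
# hand, the full HEC path must be included explicitly.
resp = requests.post(
    f"{endpoint}/services/collector/event",
    headers={"Authorization": f"Splunk {hec_token}"},
    json={"event": "datadog log forwarding connectivity test"},
    timeout=10,
)
print(resp.status_code, resp.text)
```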
For an Elasticsearch destination, configure the following settings:

Setting | Description | Example |
---|---|---|
Endpoint | Enter the endpoint to which you want to send the logs. The endpoint must start with `https://`. | `https://<your_account>.us-central1.gcp.cloud.es.io` |
Destination Index Name | Specify the name of the destination index where you want to send the logs. | `your_index_name` |
Index Rotation | Optionally, select how often to create a new index: `No Rotation`, `Every Hour`, `Every Day`, `Every Week`, or `Every Month`. The default is `No Rotation`. | `Every Day` |
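Likewise, you can sanity-check that the destination index is writable before enabling forwarding. A sketch with placeholder endpoint, index name, and credentials:

```python
import requests

# Placeholder values; substitute your own cluster endpoint, index, and credentials.
endpoint = "https://<your_account>.us-central1.gcp.cloud.es.io"
index = "your_index_name"

# POST a test document to the index (creates the index if it does not exist).
resp = requests.post(
    f"{endpoint}/{index}/_doc",
    json={"message": "datadog log forwarding connectivity test"},
    auth=("myaccount", "mypassword"),
    timeout=10,
)
print(resp.status_code, resp.json())
```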
For a Microsoft Sentinel (Azure) destination, configure the following settings:

Setting | Description | Example |
---|---|---|
Logs Ingestion Endpoint | Enter the endpoint on the Data Collection Endpoint (DCE) where logs are sent. This is labeled “Logs Ingestion” on the DCE Overview page. | `https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com` |
Immutable ID | Specify the immutable ID of the Data Collection Rule (DCR) where logging routes are defined, found as “Immutable Id” on the DCR Overview page. Note: Ensure the Monitoring Metrics Publisher role is assigned in the DCR IAM settings. | `dcr-000a00a000a00000a000000aa000a0aa` |
Stream Declaration Name | Provide the name of the target stream declaration, found in the Resource JSON of the DCR under `streamDeclarations`. | `Custom-MyTable` |
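These three settings map onto Azure's Logs Ingestion API: logs are posted to the DCE, routed through the DCR's immutable ID, and matched to a stream declaration. A sketch using the `azure-monitor-ingestion` client, reusing the example values from the table (the log fields must match the columns declared for the stream):

```python
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

# Example values from the table above; substitute your own DCE and DCR.
endpoint = "https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com"
rule_id = "dcr-000a00a000a00000a000000aa000a0aa"
stream_name = "Custom-MyTable"

# The identity running this needs the Monitoring Metrics Publisher role
# on the DCR, as noted in the table.
client = LogsIngestionClient(endpoint, DefaultAzureCredential())
client.upload(
    rule_id=rule_id,
    stream_name=stream_name,
    # Field names are assumptions; they must match the stream's declared columns.
    logs=[{"TimeGenerated": "2024-01-01T00:00:00Z", "RawData": "connectivity test"}],
)
```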
On the Log Forwarding page, hover over the status for a destination to see the percentage of logs that matched the filter criteria and have been forwarded in the past hour.