Forwarding Logs to Custom Destinations

Log forwarding is not available for the Government site. Contact your account representative for more information.

Overview

Log Forwarding allows you to send logs from Datadog to custom destinations like Splunk, Elasticsearch, and HTTP endpoints. This means that you can use Log Pipelines to centrally collect, process, and standardize your logs in Datadog. Then, send the logs from Datadog to other tools to support individual teams’ workflows. You can choose to forward any of the ingested logs, whether or not they are indexed, to custom destinations. Logs are forwarded in JSON format and compressed with GZIP.
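
If the custom destination is an HTTP endpoint that you operate yourself, the receiver only needs to decompress the GZIP body and parse the JSON. The following Python sketch is illustrative only: the port, the plain-HTTP listener (Datadog requires an https:// endpoint, so terminate TLS in front of it), and the exact payload framing (a single JSON document versus newline-delimited events) are assumptions, not documented behavior.

```python
import gzip
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class LogReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        raw = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        # Forwarded payloads are GZIP-compressed; fall back to plain bodies for test clients.
        if self.headers.get("Content-Encoding", "").lower() == "gzip" or raw[:2] == b"\x1f\x8b":
            raw = gzip.decompress(raw)
        body = raw.decode("utf-8")
        try:
            logs = json.loads(body)  # single JSON document (for example, an array of events)
        except json.JSONDecodeError:
            logs = [json.loads(line) for line in body.splitlines() if line]  # newline-delimited fallback
        count = len(logs) if isinstance(logs, list) else 1
        print(f"received {count} log event(s)")
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    # Run behind a TLS-terminating proxy in practice; the destination URL must start with https://.
    HTTPServer(("0.0.0.0", 8080), LogReceiver).serve_forever()
```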

Note: Only Datadog users with the logs_write_forwarding_rules permission can create, edit, and delete custom destinations for forwarding logs.

The Log Forwarding page, showing custom destinations highlighted. The list of destinations includes Splunk (filtered by service:logs-processing), HTTP Endpoint (filtered by source:okta OR source:paloalto), and Elasticsearch (filtered by team:acme env:prod).

If a forwarding attempt fails (for example: if your destination temporarily becomes unavailable), Datadog retries periodically for 2 hours using an exponential backoff strategy. The first attempt is made following a 1-minute delay. For subsequent retries, the delay increases progressively to a maximum of 8-12 minutes (10 minutes with 20% variance).
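
As a rough illustration of that schedule, the sketch below prints one possible sequence of retry delays. Only the 1-minute initial delay, the roughly 10-minute cap with 20% variance, and the 2-hour window are documented; the doubling factor and the way the variance is applied are assumptions made for this example.

```python
import random

def retry_delays(window_s: float = 2 * 60 * 60) -> list[float]:
    """Illustrative retry schedule: 1 min initial delay, exponential growth,
    then an 8-12 minute delay (10 min +/- 20%) once capped, for up to 2 hours."""
    delays: list[float] = []
    elapsed, base = 0.0, 60.0
    while elapsed < window_s:
        delay = base if base < 600 else 600 * random.uniform(0.8, 1.2)
        delays.append(delay)
        elapsed += delay
        base *= 2  # assumed growth factor; the exact multiplier is not documented
    return delays

if __name__ == "__main__":
    print([round(d / 60, 1) for d in retry_delays()])  # delays in minutes
```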

The following metrics report on forwarded logs, covering both logs that were sent successfully (including logs that succeeded after retries) and logs that were dropped.

  • datadog.forwarding.logs.bytes
  • datadog.forwarding.logs.count
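
You can graph these metrics in Datadog or pull them with the standard timeseries query API. The sketch below assumes the default US1 site (api.datadoghq.com), API and application keys stored in the DD_API_KEY and DD_APP_KEY environment variables, and a plain sum query; adjust these for your site and setup.

```python
import os
import time
import requests

DD_SITE = "https://api.datadoghq.com"  # assumed US1 site; change for other Datadog sites
headers = {
    "DD-API-KEY": os.environ["DD_API_KEY"],          # assumed env var names
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}
now = int(time.time())
params = {
    "from": now - 3600,                                # past hour
    "to": now,
    "query": "sum:datadog.forwarding.logs.count{*}",   # volume of forwarded log events
}
resp = requests.get(f"{DD_SITE}/api/v1/query", headers=headers, params=params)
resp.raise_for_status()
for series in resp.json().get("series", []):
    points = series.get("pointlist", [])
    if points:
        print(series["metric"], points[-1])  # [timestamp_ms, value]
```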

Set up log forwarding to custom destinations

  1. Add the webhook IPs from the Datadog IP ranges list to your destination's allowlist. (See the snippet after these setup steps for one way to retrieve them programmatically.)
  2. Navigate to Log Forwarding.
  3. Select Custom Destinations.
  4. Click New Destination.
  5. Enter the query to filter your logs for forwarding. See Search Syntax for more information.
  6. Select the Destination Type, then follow the steps below for that type (HTTP endpoint, Splunk, or Elasticsearch).
The destination configuration page, showing the steps to set up a new destination.

  For an HTTP endpoint destination:
  1. Enter a name for the destination.
  2. In the Define endpoint field, enter the endpoint to which you want to send the logs. The endpoint must start with https://.
  3. In the Configure Authentication section, select one of the following authentication types and provide the relevant details:
    • Basic Authentication: Provide the username and password for the account to which you want to send logs.
    • Request Header: Provide the header name and value. For example, if you use the Authorization header, the username for the account to which you want to send logs is myaccount, and the password is mypassword:
      • Enter Authorization for the Header Name.
      • The header value is in the format Basic username:password, where username:password is encoded in base64. For this example, the header value is Basic bXlhY2NvdW50Om15cGFzc3dvcmQ= (see the snippet after these steps).
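
A quick way to produce that header value for your own credentials, shown here with the example credentials above:

```python
import base64

username, password = "myaccount", "mypassword"  # example credentials from the steps above
token = base64.b64encode(f"{username}:{password}".encode()).decode()
print(f"Basic {token}")  # -> Basic bXlhY2NvdW50Om15cGFzc3dvcmQ=
```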

  For a Splunk destination:
  1. Enter a name for the destination.
  2. In the Configure Destination section, enter the endpoint to which you want to send the logs. The endpoint must start with https://. For example, enter https://<your_account>.splunkcloud.com:8088.
    Note: /services/collector/event is automatically appended to the endpoint.
  3. In the Configure Authentication section, enter the Splunk HEC token. See Set up and use HTTP Event Collector for more information about the Splunk HEC token.
    Note: Indexer acknowledgment must be disabled.
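
Before saving the destination, you may want to confirm that the HEC endpoint and token accept events. The sketch below posts a test event directly to a standard Splunk Cloud HEC URL; the hostname and token are placeholders, and the /services/collector/event path mirrors what Datadog appends.

```python
import requests

SPLUNK_HEC_URL = "https://<your_account>.splunkcloud.com:8088/services/collector/event"  # placeholder
HEC_TOKEN = "<your_hec_token>"  # placeholder

resp = requests.post(
    SPLUNK_HEC_URL,
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    json={"event": "datadog log forwarding connectivity test"},
    timeout=10,
)
print(resp.status_code, resp.text)  # a "Success" response indicates the token works
```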

  For an Elasticsearch destination:
  1. Enter a name for the destination.
  2. In the Configure Destination section, enter the following details:
    a. The endpoint to which you want to send the logs. The endpoint must start with https://. An example endpoint for Elasticsearch: https://<your_account>.us-central1.gcp.cloud.es.io.
    b. The name of the destination index where you want to send the logs.
    c. Optionally, select how often a new index should be created (the index rotation): No Rotation, Every Hour, Every Day, Every Week, or Every Month. The default is No Rotation.
  3. In the Configure Authentication section, enter the username and password for your Elasticsearch account.
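
To check the endpoint and credentials before saving the destination, you can call the cluster root with basic authentication. The hostname, username, and password below are placeholders.

```python
import requests

ES_ENDPOINT = "https://<your_account>.us-central1.gcp.cloud.es.io"  # placeholder
resp = requests.get(ES_ENDPOINT, auth=("<username>", "<password>"), timeout=10)
resp.raise_for_status()

info = resp.json()
# The cluster root returns basic cluster metadata if authentication succeeded.
print(info.get("cluster_name"), info.get("version", {}).get("number"))
```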

  For any destination type, complete the setup:
  1. In the Select Tags to Forward section:
    a. Select whether you want All tags, No tags, or Specific Tags to be included.
    b. Select whether you want to Include or Exclude specific tags, and specify which tags to include or exclude.
  2. Click Save.
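
As referenced in step 1 above, one way to retrieve the webhook IP ranges programmatically is to read the published IP ranges endpoint. The sketch below assumes the default Datadog site; other sites publish their own list (for example, ip-ranges.datadoghq.eu).

```python
import requests

# Fetch the published IP ranges; the "webhooks" section covers log forwarding webhooks.
ranges = requests.get("https://ip-ranges.datadoghq.com/", timeout=10).json()
webhook_ranges = ranges["webhooks"]["prefixes_ipv4"] + ranges["webhooks"].get("prefixes_ipv6", [])
print("\n".join(webhook_ranges))  # add these CIDR blocks to your destination's allowlist
```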

On the Log Forwarding page, hover over the status for a destination to see the percentage of logs that matched the filter criteria and have been forwarded in the past hour.

Edit a destination

  1. Navigate to Log Forwarding.
  2. Select Custom Destinations to view a list of all existing destinations.
  3. Click the Edit button for the destination you want to edit.
  4. Make the changes on the configuration page.
  5. Click Save.

Delete a destination

  1. Navigate to Log Forwarding.
  2. Select Custom Destinations to view a list of all existing destinations.
  3. Click the Delete button for the destination that you want to delete, and click Confirm. This removes the destination from the list of configured destinations, and logs are no longer forwarded to it.
