---
title: Forwarding Logs to Custom Destinations
description: Forward logs from Datadog to custom destinations such as Splunk, Elasticsearch, and HTTP endpoints.
---

# Forwarding Logs to Custom Destinations

## Overview{% #overview %}

Log Forwarding allows you to send logs from Datadog to custom destinations such as Splunk, Elasticsearch, and HTTP endpoints. This means you can use [Log Pipelines](https://docs.datadoghq.com/logs/log_configuration/pipelines/) to centrally collect, process, and standardize your logs in Datadog, and then send them to other tools to support individual teams' workflows. You can forward any ingested logs, whether or not they are indexed, to custom destinations. Logs are forwarded in JSON format and compressed with GZIP by default.

**Note**: Only Datadog users with the [`logs_write_forwarding_rules`](https://docs.datadoghq.com/account_management/rbac/permissions/?tab=ui#log-management) permission can [create](https://docs.datadoghq.com/logs/log_configuration/forwarding_custom_destinations#set-up-log-forwarding-to-custom-destinations), [edit](https://docs.datadoghq.com/logs/log_configuration/forwarding_custom_destinations#edit-a-destination), and [delete](https://docs.datadoghq.com/logs/log_configuration/forwarding_custom_destinations#delete-a-destination) custom destinations for forwarding logs.

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/forwarding/forwarding_page.c67128998cf3c7191cdfc494db6e9da8.png?auto=format"
   alt="The Log Forwarding page, showing custom destinations highlighted. The list of destinations includes Splunk (filtered by service:logs-processing), HTTP Endpoint (filtered by source:okta OR source:paloalto), and Elasticsearch (filtered by team:acme env:prod)." /%}

If a forwarding attempt fails (for example, because your destination is temporarily unavailable), Datadog retries periodically for 2 hours using an exponential backoff strategy: the first retry attempt is made after a 1-minute delay, and subsequent delays increase progressively to a maximum of 8-12 minutes (10 minutes with 20% variance).
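The retry schedule described above can be sketched as follows. This is an illustrative model of the documented behavior (1-minute first delay, exponential growth capped at 10 minutes with ±20% variance, about 2 hours total), not Datadog's actual implementation:

```python
import random

def retry_delays(max_total_minutes=120, first_delay=1, cap=10, jitter=0.2):
    """Yield successive retry delays in minutes: a 1-minute first delay,
    then exponential growth capped near 10 minutes with +/-20% variance."""
    elapsed, delay = 0.0, first_delay
    while elapsed < max_total_minutes:
        # Apply the cap, then the +/-20% variance.
        jittered = min(delay, cap) * random.uniform(1 - jitter, 1 + jitter)
        yield jittered
        elapsed += jittered
        delay *= 2  # exponential backoff

delays = list(retry_delays())
```

Under this model, every delay falls between roughly 0.8 and 12 minutes, and retries stop once the 2-hour window is exhausted.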

The following metrics report on logs that were forwarded successfully (including logs sent successfully after retries) and on logs that were dropped:

- `datadog.forwarding.logs.bytes`
- `datadog.forwarding.logs.count`
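For example, you can watch these metrics programmatically through the Datadog metrics query API. The sketch below only builds the request (it does not send it); the site hostname `api.datadoghq.com` and the placeholder keys are assumptions you should adjust for your account and region:

```python
import time
import urllib.parse
import urllib.request

def forwarding_count_request(api_key, app_key, site="api.datadoghq.com"):
    """Build a v1 metrics query for forwarded-log counts over the past hour."""
    now = int(time.time())
    params = urllib.parse.urlencode({
        "from": now - 3600,
        "to": now,
        "query": "sum:datadog.forwarding.logs.count{*}.as_count()",
    })
    return urllib.request.Request(
        f"https://{site}/api/v1/query?{params}",
        headers={"DD-API-KEY": api_key, "DD-APPLICATION-KEY": app_key},
    )

req = forwarding_count_request("<DD_API_KEY>", "<DD_APP_KEY>")
# urllib.request.urlopen(req) would return a JSON timeseries response.
```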

## Set up log forwarding to custom destinations{% #set-up-log-forwarding-to-custom-destinations %}

{% callout %}
# Important note for users on the following Datadog sites: app.ddog-gov.com

{% alert level="danger" %}
Logs sent to a custom destination leave the Datadog GovCloud environment and are outside the control of Datadog. Datadog shall not be responsible for any logs that have left the Datadog GovCloud environment, including without limitation, any obligations or requirements that the user may have related to FedRAMP, DoD Impact Levels, ITAR, export compliance, data residency, or similar regulations applicable to such logs.

Due to security protocols for the US1-FED site, only ports 443 and 8088 are open for log forwarding. To use a different port, contact [Datadog Support](https://www.datadoghq.com/support/).
{% /alert %}
{% /callout %}

1. Add the webhook IPs from the [IP ranges list](https://docs.datadoghq.com/api/latest/ip-ranges/) to your destination's allowlist.
1. Navigate to [Log Archiving & Forwarding](https://app.datadoghq.com/logs/pipelines/log-forwarding/custom-destinations).
1. Select **Custom Destinations**.
1. Click **New Destination**.
1. Enter the query to filter your logs for forwarding. See [Search Syntax](https://docs.datadoghq.com/logs/explorer/search_syntax/) for more information.
1. Select the **Destination Type**.

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/forwarding/log-forwarding-gzip-opt-out.aed7e51795e3fb4677d3419e2be9820c.png?auto=format"
   alt="The destination configuration page, showing the steps to set up a new destination." /%}

{% tab title="HTTP" %}
Enter a name for the destination.

In the **Define endpoint** field, enter the endpoint to which you want to send the logs. The endpoint must start with `https://`.

- For example, if you want to send logs to Sumo Logic, follow their [Configure HTTP Source for Logs and Metrics documentation](https://help.sumologic.com/docs/send-data/hosted-collectors/http-source/logs-metrics/) to get the HTTP Source Address URL to send data to their collector. Enter the HTTP Source Address URL in the **Define endpoint** field.

(Optional) Disable GZIP compression if your HTTP endpoint does not support compressed payloads.

In the **Configure Authentication** section, select one of the following authentication types and provide the relevant details:

| Authentication Type      | Description | Example |
| ------------------------ | ----------- | ------- |
| **Basic Authentication** | Provide the username and password for the account to which you want to send logs. | Username: `myaccount`, Password: `mypassword` |
| **Request Header**       | Provide the header name and value. For example, for Authorization, enter `Authorization` for **Header Name**, and use a header value formatted as `Basic username:password`, encoded in base64. | Header Name: `Authorization`, Header Value: `Basic bXlhY2NvdW50Om15cGFzc3dvcmQ=` |
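
The request-header value in the table above is simply the base64 encoding of `username:password`. You can compute it yourself, for example:

```python
import base64

def basic_auth_header(username, password):
    """Return a Basic auth header value for the given credentials."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

print(basic_auth_header("myaccount", "mypassword"))
# Basic bXlhY2NvdW50Om15cGFzc3dvcmQ=
```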

{% /tab %}

{% tab title="Splunk" %}
Enter a name for the destination.

In the **Configure Destination** section, enter the endpoint to which you want to send the logs. The endpoint must start with `https://`. For example, enter `https://<your_account>.splunkcloud.com:8088`.

**Note**: `/services/collector/event` is automatically appended to the endpoint.

In the **Configure Authentication** section, enter the Splunk HEC token. See [Set up and use HTTP Event Collector](https://docs.splunk.com/Documentation/Splunk/9.0.1/Data/UsetheHTTPEventCollector) for more information about the Splunk HEC token.

**Note**: The [indexer acknowledgment](https://docs.splunk.com/Documentation/Splunk/9.0.3/Data/AboutHECIDXAck) must be disabled.
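
As a sanity check before saving the destination, you can send a test event to your HEC endpoint yourself. This standard-library sketch mirrors the URL and token handling described above (it builds the request; uncomment the send to actually test):

```python
import json
import urllib.request

def hec_request(endpoint, hec_token, event):
    """Build a Splunk HEC request; Datadog appends the same
    /services/collector/event path to the endpoint you configure."""
    return urllib.request.Request(
        endpoint.rstrip("/") + "/services/collector/event",
        data=json.dumps({"event": event}).encode(),
        headers={"Authorization": f"Splunk {hec_token}"},
        method="POST",
    )

req = hec_request("https://<your_account>.splunkcloud.com:8088",
                  "<HEC_TOKEN>", {"message": "hello from Datadog"})
# urllib.request.urlopen(req) sends the test event.
```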
{% /tab %}

{% tab title="Elasticsearch" %}
Enter a name for the destination.

In the **Configure Destination** section, enter the following details:
| Setting                    | Description                                                                                                                                              | Example                                                              |
| -------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------- |
| **Endpoint**               | Enter the endpoint to which you want to send the logs. The endpoint must start with `https://`.                                                          | `https://<your_account>.us-central1.gcp.cloud.es.io` (Elasticsearch) |
| **Destination Index Name** | Specify the name of the destination index where you want to send the logs.                                                                               | `your_index_name`                                                    |
| **Index Rotation**         | Optionally, select how often to create a new index: `No Rotation`, `Every Hour`, `Every Day`, `Every Week`, `Every Month`. The default is `No Rotation`. | `Every Day`                                                          |

In the **Configure Authentication** section, enter the username and password for your Elasticsearch account.
{% /tab %}

{% tab title="Microsoft Sentinel" %}
Enter a name for the destination.

Authentication for the Microsoft Sentinel Forwarder requires configuring an App Registration through the Datadog Azure Integration.

In the **Configure Destination** section, enter the following details:
| Setting                     | Description                                                                                                                                                                                                                                   | Example                                                 |
| --------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------- |
| **Logs Ingestion Endpoint** | Enter the endpoint on the Data Collection Endpoint (DCE) where logs are sent. This is labeled "Logs Ingestion" on the DCE Overview page.                                                                                                      | `https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com` |
| **Immutable ID**            | Specify the immutable ID of the Data Collection Rule (DCR) where logging routes are defined, as found on the DCR Overview page as "Immutable Id". **Note**: Ensure the Monitoring Metrics Publisher role is assigned in the DCR IAM settings. | `dcr-000a00a000a00000a000000aa000a0aa`                  |
| **Stream Declaration Name** | Provide the name of the target Stream Declaration found in the Resource JSON of the DCR under `streamDeclarations`.                                                                                                                           | `Custom-MyTable`                                        |
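
The three settings above combine into the Azure Monitor Logs Ingestion API URL that receives the forwarded logs. A sketch of how they fit together (the `api-version` value is an assumption; check the current Azure documentation for your deployment):

```python
def logs_ingestion_url(dce_endpoint, dcr_immutable_id, stream_name,
                       api_version="2023-01-01"):
    """Combine the DCE Logs Ingestion endpoint, DCR immutable ID, and
    stream declaration name into a Logs Ingestion API URL."""
    return (f"{dce_endpoint}/dataCollectionRules/{dcr_immutable_id}"
            f"/streams/{stream_name}?api-version={api_version}")

url = logs_ingestion_url(
    "https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com",
    "dcr-000a00a000a00000a000000aa000a0aa",
    "Custom-MyTable",
)
```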

{% /tab %}

{% tab title="Google SecOps (Chronicle)" %}

{% alert level="info" %}
Preview available: You can send logs to Google SecOps (Chronicle) from Datadog. [Register for the Preview](https://www.datadoghq.com/product-preview/log-forwarding-to-google-chronicle/).
{% /alert %}

Enter a name for the destination.

Authentication for the Google Chronicle Forwarder requires using a GCP Service Account with Chronicle write access.

In the **Configure Destination** section, enter the following details:

| Setting               | Description                                                                                                                                                | Example                               |
| --------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------- |
| **Customer ID**       | The Chronicle customer ID provided by Google.                                                                                                              | `abcd1234`                            |
| **Regional Endpoint** | The Chronicle ingestion API endpoint URL based on your region.                                                                                             | `https://us.chronicle.googleapis.com` |
| **Namespace**         | The namespace in which your Chronicle logs should be ingested.                                                                                             | `default`                             |

In the **Configure authentication settings** section, enter the following details:

| Setting            | Description                                                      | Example                                                             |
| ------------------ | ---------------------------------------------------------------- | ------------------------------------------------------------------- |
| **Project ID**     | The GCP project ID associated with the Chronicle instance.       | `my-gcp-chronicle-project`                                          |
| **Private Key ID** | The ID of the private key from your service account credentials. | `0123456789abcdef`                                                  |
| **Private Key**    | The private key from your service account credentials.           | `-----BEGIN PRIVATE KEY-----\nMIIE...`                              |
| **Client Email**   | The email address of the service account.                        | `chronicle-writer@my-gcp-chronicle-project.iam.gserviceaccount.com` |
| **Client ID**      | The client ID from your service account credentials.             | `123456789012345678901`                                             |
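
Most of these authentication values come from the JSON key file of the GCP service account. An abbreviated illustration of where each field lives in that file (the key material shown is a truncated placeholder):

```json
{
  "type": "service_account",
  "project_id": "my-gcp-chronicle-project",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\nMIIE...",
  "client_email": "chronicle-writer@my-gcp-chronicle-project.iam.gserviceaccount.com",
  "client_id": "123456789012345678901"
}
```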

{% /tab %}

In the **Select Tags to Forward** section:

1. Select whether you want **All tags**, **No tags**, or **Specific Tags** to be included.
1. Select whether you want to **Include** or **Exclude** specific tags, and specify which tags to include or exclude.

Click **Save**.

On the [Log Forwarding](https://app.datadoghq.com/logs/pipelines/log-forwarding/custom-destinations) page, hover over the status for a destination to see the percentage of logs that matched the filter criteria and were forwarded in the past hour.

## Edit a destination{% #edit-a-destination %}

1. Navigate to [Log Forwarding](https://app.datadoghq.com/logs/pipelines/log-forwarding/custom-destinations).
1. Select **Custom Destinations** to view a list of all existing destinations.
1. Click the **Edit** button for the destination you want to edit.
1. Make the changes on the configuration page.
1. Click **Save**.

## Delete a destination{% #delete-a-destination %}

1. Navigate to [Log Forwarding](https://app.datadoghq.com/logs/pipelines/log-forwarding/custom-destinations).
1. Select **Custom Destinations** to view a list of all existing destinations.
1. Click the **Delete** button for the destination that you want to delete, and click **Confirm**. This removes the destination from the configured list of destinations and logs are no longer forwarded to it.

## Further reading{% #further-reading %}

- [Route logs to third-party systems with Datadog Log Forwarding](https://www.datadoghq.com/blog/route-logs-with-datadog-log-forwarding/)
- [Start collecting your logs](https://docs.datadoghq.com/logs/log_collection)
- [Learn about log pipelines](https://docs.datadoghq.com/logs/log_configuration/pipelines)
- [Forward logs directly from your environment with Observability Pipelines](https://docs.datadoghq.com/observability_pipelines/)
- [Centrally process and govern your logs in Datadog before sending them to Microsoft Sentinel](https://www.datadoghq.com/blog/microsoft-sentinel-logs/)
- [Forward security signals, spans, and other event types to custom destinations](https://docs.datadoghq.com/security/events_forwarding)
