---
title: Datadog Logs Destination
description: Datadog, the leading service for cloud-scale monitoring.
breadcrumbs: Docs > Observability Pipelines > Destinations > Datadog Logs Destination
---

# Datadog Logs Destination

{% callout %}
# Important note for users on the following Datadog sites: app.ddog-gov.com

{% alert level="danger" %}
This product is not supported for your selected [Datadog site](https://docs.datadoghq.com/getting_started/site).
{% /alert %}

{% /callout %}
Available for: {% icon name="icon-logs" /%} Logs

Use Observability Pipelines' Datadog Logs destination to send logs to Datadog Log Management. You can also use AWS PrivateLink to send logs from Observability Pipelines to Datadog.

## Setup{% #setup %}

Set up the Datadog Logs destination and its environment variables when you [set up a pipeline](https://app.datadoghq.com/observability-pipelines). The information below is configured in the pipelines UI.

### Set up the destination{% #set-up-the-destination %}

There are no required setup steps.

#### Optional settings{% #optional-settings %}

##### Route logs to multiple Datadog organizations{% #route-logs-to-multiple-datadog-organizations %}

You can route logs to multiple Datadog organizations. After routing has been set up, you can view metrics for the component or specific organizations to which you are routing logs.

**Note**: You can route logs to up to 100 Datadog organizations.

{% image
   source="https://datadog-docs.imgix.net/images/observability_pipelines/destinations/multi_dd_orgs.2fe334630dd081475204a762bd6b892b.png?auto=format"
   alt="The Datadog Logs destination showing us1 and us3 org" /%}

Click **Route to Multiple Organizations** to set up routing to multiple Datadog organizations.

- If you haven't added any organizations yet, enter organization details as described in [Add an organization](#add-an-organization).
- If you have already added organizations, you can:
  - Click on an organization in the table to edit or delete it.
  - Use the search bar to find a specific organization by name, filter query, or Datadog site, and then select the organization to edit or delete it.
  - View metrics for an organization.
  - Click **Add organization** to route to another Datadog organization.

**Note**: If you don't set up routing to multiple Datadog organizations, logs are routed to the default Datadog organization, which is the organization that is tied to the API key when you install the Worker.

##### Add an organization{% #add-an-organization %}

{% alert level="warning" %}
Logs that do not match any of the organization filters are dropped. The component metric `Data dropped (intentional)` shows the number of logs that do not match the filters and are dropped.
{% /alert %}

1. Enter a name for the organization.
   - **Note**: The name does not have to correspond to the actual name of the Datadog organization.
1. Define a filter query. Only logs that match the specified filter query are sent to the organization. See [Observability Pipelines Search Syntax](https://docs.datadoghq.com/observability_pipelines/search_syntax/logs/) for more information on writing filter queries.
1. Select the Datadog organization's site.
1. Enter the identifier for the API key for that Datadog organization.
   - **Note**: Only enter the identifier for the API key. Do **not** enter the actual API key.
1. Click **Save**.
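For example, a filter query that routes only a hypothetical service's production logs to an organization might look like the following (the `service` and `env` values are placeholders; see the search syntax documentation linked above for the full grammar):

```
service:payments env:production
```

Only logs matching this query would be sent to that organization; all other logs are evaluated against the remaining organizations' filters.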

##### Buffering{% #buffering %}

Toggle the switch to enable **Buffering Options**. A configurable buffer on your destination ensures that intermittent latency or an outage at the destination doesn't create immediate backpressure, allowing events to continue to be ingested from your source. Disk buffers can also increase pipeline durability by writing data to disk, so buffered data persists through a Worker restart. See [Destination buffers](https://docs.datadoghq.com/observability_pipelines/scaling_and_performance/buffering_and_backpressure/#destination-buffers) for more information.

- If left unconfigured, your destination uses a memory buffer with a capacity of 500 events.
- To configure a buffer on your destination:
  1. Select the buffer type you want to set (**Memory** or **Disk**).
  1. Enter the buffer size and select the unit.
     1. Maximum memory buffer size is 128 GB.
     1. Maximum disk buffer size is 500 GB.
  1. In the **Behavior on full buffer** dropdown menu, select whether you want to **block** events or **drop new events** when the buffer is full.
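The difference between the two full-buffer behaviors can be sketched as follows. This is an illustrative Python sketch of the semantics, not the Worker's actual implementation:

```python
from queue import Queue, Full

def enqueue(buf: Queue, event, behavior: str) -> bool:
    """Illustrates the two 'Behavior on full buffer' options.

    Returns True if the event made it into the buffer, False if it was dropped.
    """
    if behavior == "block":
        # Block: wait for space to free up, which creates backpressure upstream.
        buf.put(event)
        return True
    # Drop new events: discard the incoming event when the buffer is full.
    try:
        buf.put_nowait(event)
        return True
    except Full:
        return False
```

With **block**, a full buffer stalls ingestion until the destination drains; with **drop new events**, ingestion continues but events arriving while the buffer is full are lost.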

### Set secrets{% #set-secrets %}

**Note**: If you entered identifiers for your secrets and then choose to use environment variables, the environment variable name is the identifier you entered prefixed with `DD_OP_`. For example, if you entered `PASSWORD_1` as a password identifier, the environment variable for that password is `DD_OP_PASSWORD_1`.
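Continuing the `PASSWORD_1` example from the note above, the environment variable would be set as follows before starting the Worker (purely illustrative; this destination itself has no secret identifiers):

```shell
# The identifier entered in the UI was PASSWORD_1;
# the Worker reads the value from DD_OP_PASSWORD_1.
export DD_OP_PASSWORD_1="<secret-value>"
```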

{% tab title="Secrets Management" %}
There are no secret identifiers for this destination.
{% /tab %}

{% tab title="Environment Variables" %}
No environment variables required.
{% /tab %}

## View metrics for the component or specific organizations{% #view-metrics-for-the-component-or-specific-organizations %}

You can view metrics at the component level or organization level.

### Component-level metrics{% #component-level-metrics %}

To view metrics for the overall Datadog Logs destination:

1. Navigate to [Observability Pipelines](https://app.datadoghq.com/observability-pipelines).
1. Select your pipeline.
1. Click the cog on the **Datadog Logs** destination and select **View details**.

**Note**: The **Data dropped (intentional)** metric shows logs that didn't match any of the organizations' filters.

### Organization-level metrics{% #organization-level-metrics %}

To view metrics for a specific Datadog organization:

1. Navigate to [Observability Pipelines](https://app.datadoghq.com/observability-pipelines).
1. Select your pipeline.
1. Click the **Datadog Logs** destination to display its organizations.
   {% image
      source="https://datadog-docs.imgix.net/images/observability_pipelines/destinations/multi_dd_orgs_highlighted.efa6887fa371d1b306761a6c3a5a1558.png?auto=format"
      alt="The Datadog Logs destination showing us1 and us3 org highlighted" /%}
1. Click the organization you want to see metrics for.
1. Click **View Health Metrics**.

Alternatively, you can click **Review Configured Organizations** in the Datadog Logs destination, then click the graph icon in the **Metrics** column for the organization you are interested in.

## How the destination works{% #how-the-destination-works %}

### Event batching{% #event-batching %}

A batch of events is flushed when any one of these limits is reached. See [event batching](https://docs.datadoghq.com/observability_pipelines/destinations/#event-batching) for more information.

| Maximum Events | Maximum Size (MB) | Timeout (seconds) |
| -------------- | ----------------- | ----------------- |
| 1,000          | 4.25              | 5                 |
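The flush condition amounts to a simple disjunction of the three limits. The following is an illustrative sketch, not the Worker's actual code; the byte conversion for 4.25 MB is an assumption:

```python
MAX_EVENTS = 1_000
MAX_BYTES = int(4.25 * 1024 * 1024)  # 4.25 MB (illustrative byte conversion)
TIMEOUT_SECONDS = 5

def should_flush(event_count: int, batch_bytes: int, elapsed_seconds: float) -> bool:
    """A batch is flushed as soon as any one of the three limits is reached."""
    return (
        event_count >= MAX_EVENTS
        or batch_bytes >= MAX_BYTES
        or elapsed_seconds >= TIMEOUT_SECONDS
    )
```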



{% callout %}
# Important note for users on the following Datadog sites: app.datadoghq.com, ap1.datadoghq.com, ap2.datadoghq.com



## AWS PrivateLink{% #aws-privatelink %}

To send logs from Observability Pipelines to Datadog using AWS PrivateLink, see [Connect to Datadog over AWS PrivateLink](https://docs.datadoghq.com/agent/guide/private-link/?tab=crossregionprivatelinkendpoints) for setup instructions. The two endpoints you need to set up are:

- Logs (User HTTP intake):
- Remote Configuration:

**Note**: The `obpipeline-intake.datadoghq.com` endpoint is used for Live Capture and is not available as a PrivateLink endpoint.


{% /callout %}

{% callout %}
# Important note for users on the following Datadog sites: us3.datadoghq.com



## Azure Private Link{% #azure-private-link %}

To send logs from Observability Pipelines to Datadog using Azure Private Link, see [Connect to Datadog over Azure Private Link](https://docs.datadoghq.com/agent/guide/azure-private-link/?site=us3) for setup instructions. The two endpoints you need to set up are:

- Logs (User HTTP intake): `http-intake.logs.us3.datadoghq.com`
- Remote Configuration: `config.us3.datadoghq.com`

**Note**: The `obpipeline-intake.datadoghq.com` endpoint is used for Live Capture and is not available as a Private Link endpoint.


{% /callout %}


