---
title: Set Up Pipelines
description: Learn how to set up, clone, and delete pipelines in Observability Pipelines.
breadcrumbs: Docs > Observability Pipelines > Configuration > Set Up Pipelines
---

# Set Up Pipelines

{% callout %}
# Important note for users on the following Datadog site: app.ddog-gov.com

{% alert level="danger" %}
This product is not supported for your selected [Datadog site](https://docs.datadoghq.com/getting_started/site.md).
{% /alert %}

{% /callout %}

## Overview{% #overview %}

{% alert level="info" %}
The pipelines and processors outlined in this documentation are specific to on-premises logging environments. To aggregate, process, and route cloud-based logs, see [Log Management Pipelines](https://docs.datadoghq.com/logs/log_configuration/pipelines.md?tab=source).
{% /alert %}

In Observability Pipelines, a pipeline is a sequential path with three types of components:

- [Source](https://docs.datadoghq.com/observability_pipelines/sources.md): Receives data from your data source (for example, the Datadog Agent).
- [Processors](https://docs.datadoghq.com/observability_pipelines/processors.md): Enrich and transform your data.
- [Destinations](https://docs.datadoghq.com/observability_pipelines/destinations.md): Where your processed data is sent.

{% image
   source="https://docs.dd-static.net/images/observability_pipelines/archive_log_pipeline.4ec6cfc248e84c55b569425ac3328fd5.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/observability_pipelines/archive_log_pipeline.4ec6cfc248e84c55b569425ac3328fd5.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="Pipeline with one source connected to two processor groups and two destinations" /%}

You can create a pipeline with one of the following methods:

- Pipeline UI
- API
- Terraform

See [Export a Pipeline Configuration to JSON or Terraform](https://docs.datadoghq.com/observability_pipelines/configuration/export_pipeline_configuration.md) if you want to programmatically deploy a pipeline created in the UI.

## Set up a pipeline in the UI{% #set-up-a-pipeline-in-the-ui %}

### Set up pipeline components{% #set-up-pipeline-components %}

{% tab title="Logs" %}

1. Navigate to [Observability Pipelines](https://app.datadoghq.com/observability-pipelines).
1. Select a [template](https://docs.datadoghq.com/observability_pipelines/configuration/explore_templates.md) based on your use case.
1. Select and set up your [source](https://docs.datadoghq.com/observability_pipelines/sources.md).
1. Add [processors](https://docs.datadoghq.com/observability_pipelines/processors.md) to transform, redact, and enrich your log data. **Note**: For a pipeline canvas, there is a limit of 25 processor groups and a total of 150 processors.
   - If you want to copy a processor, click the copy icon for that processor and then paste it (`Cmd+V` on Mac, `Ctrl+V` on Windows/Linux).
1. Select and set up [destinations](https://docs.datadoghq.com/observability_pipelines/destinations.md) for your processed logs.

#### Add or remove components{% #add-or-remove-components %}

##### Add another processor group{% #add-another-processor-group %}

{% image
   source="https://docs.dd-static.net/images/observability_pipelines/setup/another_processor_group.ecdee1d75efe73107cd0cd628052da1f.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/observability_pipelines/setup/another_processor_group.ecdee1d75efe73107cd0cd628052da1f.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The Pipelines page showing two processor groups sending logs to the same destination" /%}

If you want to add another group of processors for a destination:

1. Click the plus sign (**+**) at the bottom of the existing processor group.
1. Click the name of the processor group to update it.
1. Optionally, enter a group filter. Only data that match the filter query are sent through the processors in the group. See [Search Syntax](https://docs.datadoghq.com/observability_pipelines/search_syntax/logs.md) for more information.
1. Click **Add** to add processors to the group.
1. If you want to copy all processors in a group and paste them into the same processor group or a different group:
   1. Click the three dots on the processor group.
   1. Select **Copy all processors**.
   1. Select the desired processor group, and then paste the processors into it.
1. Toggle the switch to enable or disable the processor group, as well as each individual processor.

**Note**: There is a limit of 25 processor groups for a pipeline canvas.

##### Add another set of processors and destinations{% #add-another-set-of-processors-and-destinations %}

{% image
   source="https://docs.dd-static.net/images/observability_pipelines/setup/another_set_processor_destination.c9a4b6845d6091f1ccf16cd04eaaee35.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/observability_pipelines/setup/another_set_processor_destination.c9a4b6845d6091f1ccf16cd04eaaee35.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The Pipelines page showing two processor groups sending logs to two different destinations" /%}

To add another set of processors and destinations to the source, click the plus sign (**+**) to the left of the processor group.

To delete a processor group, you need to delete all destinations linked to that processor group. When the last destination is deleted, the processor group is removed with it.

##### Add another destination to a processor group{% #add-another-destination-to-a-processor-group %}

{% image
   source="https://docs.dd-static.net/images/observability_pipelines/setup/another_destination.93229910fa7394f93c6c6ff68aa42fd2.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/observability_pipelines/setup/another_destination.93229910fa7394f93c6c6ff68aa42fd2.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The Pipelines page showing one processor group sending logs to two different destinations" /%}

To add another destination to a processor group, click the plus sign (**+**) to the right of the processor group.

To delete a destination, click the pencil icon at the top right of the destination, then select **Delete node**.

- If you delete a destination from a processor group that has multiple destinations, only the deleted destination is removed.
- If you delete a destination from a processor group that only has one destination, both the destination and the processor group are removed.

**Notes**:

- A pipeline must have at least one destination. If a processor group only has one destination, that destination cannot be deleted.
- A pipeline can have a maximum of three destinations.
- A specific destination can only be added once. For example, you cannot add multiple Splunk HEC destinations.

{% /tab %}

{% tab title="Metrics" %}

{% alert level="info" %}
Metric Tag Governance is in Preview. Fill out the [form](https://www.datadoghq.com/product-preview/metrics-ingestion-and-cardinality-control-in-observability-pipelines/) to request access.
{% /alert %}

1. Navigate to [Observability Pipelines](https://app.datadoghq.com/observability-pipelines).
1. Select the [Metric Tag Governance](https://docs.datadoghq.com/observability_pipelines/configuration/explore_templates.md?tab=metrics#metric-tag-governance) template.
1. Set up the [Datadog Agent](https://docs.datadoghq.com/observability_pipelines/sources/datadog_agent.md?tab=metrics) source.
1. Add [processors](https://docs.datadoghq.com/observability_pipelines/processors.md) to filter and transform your metrics. **Note**: For a pipeline canvas, there is a limit of 25 processor groups and a total of 150 processors.
   - If you want to copy a processor, click the copy icon for that processor and then paste it (`Cmd+V` on Mac, `Ctrl+V` on Windows/Linux).
1. Set up the [Datadog Metrics](https://docs.datadoghq.com/observability_pipelines/destinations/datadog_metrics.md) destination.

#### Add another processor group{% #add-another-processor-group %}

{% image
   source="https://docs.dd-static.net/images/observability_pipelines/setup/another_processor_group_metrics.c25cab1c21073c5c57e4889516100dc4.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/observability_pipelines/setup/another_processor_group_metrics.c25cab1c21073c5c57e4889516100dc4.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The Pipelines page showing two processor groups sending logs to the same destination" /%}

If you want to add another group of processors for a destination:

1. Click the plus sign (**+**) at the bottom of the existing processor group.
1. Click the name of the processor group to update it.
1. Optionally, enter a group filter. Only data that match the filter query are sent through the processors in the group. See [Search Syntax](https://docs.datadoghq.com/observability_pipelines/search_syntax/metrics.md) for more information.
1. Click **Add** to add processors to the group.
1. If you want to copy all processors in a group and paste them into the same processor group or a different group:
   1. Click the three dots on the processor group.
   1. Select **Copy all processors**.
   1. Select the desired processor group, and then paste the processors into it.
1. Toggle the switch to enable or disable the processor group, as well as each individual processor.

**Note**: There is a limit of 25 processor groups for a pipeline canvas.

{% /tab %}

### Install the Worker and deploy the pipeline{% #install-the-worker-and-deploy-the-pipeline %}

After you have set up your source, processors, and destinations, click **Next: Install**. See [Install the Worker](https://docs.datadoghq.com/observability_pipelines/configuration/install_the_worker.md) for instructions on how to install the Worker for your platform. See [Advanced Worker Configurations](https://docs.datadoghq.com/observability_pipelines/configuration/install_the_worker/advanced_worker_configurations.md) for bootstrapping options.

If you want to make changes to your pipeline after you have deployed it, see [Update Existing Pipelines](https://docs.datadoghq.com/observability_pipelines/configuration/update_existing_pipelines.md).

### Enable out-of-the-box monitors for your pipeline{% #enable-out-of-the-box-monitors-for-your-pipeline %}

1. Navigate to the [Pipelines](https://app.datadoghq.com/observability-pipelines) page and find your pipeline.
1. Click **Enable monitors** in the **Monitors** column for your pipeline.
1. Click **Start** to set up a monitor for one of the suggested use cases.
   - The metric monitor is configured based on the selected use case. You can update the configuration to further customize it. See the [Metric monitor documentation](https://docs.datadoghq.com/monitors/types/metric.md) for more information.

## Set up a pipeline with the API{% #set-up-a-pipeline-with-the-api %}

1. Use the Observability Pipelines API to [create a pipeline](https://docs.datadoghq.com/api/latest/observability-pipelines.md#create-a-new-pipeline). See the API reference for example request payloads.

1. After creating the pipeline, [install the Worker](https://docs.datadoghq.com/observability_pipelines/configuration/install_the_worker.md?tab=docker#api-or-terraform-pipeline-setup) to send data through the pipeline.

   - See [Environment Variables](https://docs.datadoghq.com/observability_pipelines/guide/environment_variables.md) for the list of environment variables you need for the different sources, processors, and destinations when you install the Worker.

Use the [update a pipeline](https://docs.datadoghq.com/api/latest/observability-pipelines.md#update-a-pipeline) endpoint to make any changes to an existing pipeline.
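
For illustration, a create-pipeline request body might look like the following sketch. The component `id` values, the `filter` processor, and the query shown here are hypothetical examples; see the API reference linked above for the authoritative payload schema.

```json
{
  "data": {
    "type": "pipelines",
    "attributes": {
      "name": "example-pipeline",
      "config": {
        "sources": [
          { "id": "datadog-agent-source", "type": "datadog_agent" }
        ],
        "processors": [
          {
            "id": "filter-processor",
            "type": "filter",
            "include": "service:my-service",
            "inputs": ["datadog-agent-source"]
          }
        ],
        "destinations": [
          {
            "id": "datadog-logs-destination",
            "type": "datadog_logs",
            "inputs": ["filter-processor"]
          }
        ]
      }
    }
  }
}
```

Each component references the components upstream of it through `inputs`, which is how the source-to-processor-to-destination path is expressed in the configuration.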

See [Advanced Worker Configurations](https://docs.datadoghq.com/observability_pipelines/configuration/install_the_worker/advanced_worker_configurations.md) for bootstrapping options.

## Set up a pipeline with Terraform{% #set-up-a-pipeline-with-terraform %}

{% alert level="warning" %}
[Version 3.84.0](https://github.com/DataDog/terraform-provider-datadog/releases/tag/v3.84.0) of the Datadog Terraform provider replaces standalone processors with [processor groups](https://docs.datadoghq.com/observability_pipelines/processors.md#processor-groups) and is a breaking change. If you want to upgrade to provider version 3.84.0, see the [PR description](https://github.com/DataDog/terraform-provider-datadog/pull/3346) for instructions on how to migrate your existing resources.
{% /alert %}

1. Use the [datadog_observability_pipeline](https://registry.terraform.io/providers/DataDog/datadog/latest/docs) resource to create a pipeline with Terraform.

1. After creating the pipeline, [install the Worker](https://docs.datadoghq.com/observability_pipelines/configuration/install_the_worker.md?tab=docker#api-or-terraform-pipeline-setup) to send data through the pipeline.

   - See [Environment Variables](https://docs.datadoghq.com/observability_pipelines/guide/environment_variables.md) for the list of environment variables you need for the different sources, processors, and destinations when you install the Worker.

Use the [datadog_observability_pipeline](https://registry.terraform.io/providers/DataDog/datadog/latest/docs) resource to make any changes to an existing pipeline.
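
As a sketch, a pipeline definition might look like the following. The component names and the `filter` processor shown here are illustrative, and the schema changes between provider versions (see the 3.84.0 note above), so consult the provider documentation for the exact arguments.

```hcl
# Illustrative sketch only: block names and arguments are assumptions,
# not the authoritative provider schema.
resource "datadog_observability_pipeline" "example" {
  name = "example-pipeline"

  config {
    sources {
      datadog_agent {
        id = "datadog-agent-source"
      }
    }

    processors {
      filter {
        id      = "filter-processor"
        include = "service:my-service"
        inputs  = ["datadog-agent-source"]
      }
    }

    destinations {
      datadog_logs {
        id     = "datadog-logs-destination"
        inputs = ["filter-processor"]
      }
    }
  }
}
```

As in the API payload, each component references the components upstream of it through `inputs`, which expresses the source-to-processor-to-destination path.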

See [Advanced Worker Configurations](https://docs.datadoghq.com/observability_pipelines/configuration/install_the_worker/advanced_worker_configurations.md) for bootstrapping options.

## Clone a pipeline{% #clone-a-pipeline %}

To clone a pipeline in the UI:

1. Navigate to [Observability Pipelines](https://app.datadoghq.com/observability-pipelines).
1. Select the pipeline you want to clone.
1. Click the cog at the top right side of the page, then select **Clone**.

## Delete a pipeline{% #delete-a-pipeline %}

To delete a pipeline in the UI:

1. Navigate to [Observability Pipelines](https://app.datadoghq.com/observability-pipelines).
1. Select the pipeline you want to delete.
1. Click the cog at the top right side of the page, then select **Delete**.

**Note**: You cannot delete an active pipeline. You must stop all Workers for a pipeline before you can delete it.

## Pipeline requirements and limits{% #pipeline-requirements-and-limits %}

- A pipeline must have at least one destination. If a processor group only has one destination, that destination cannot be deleted.
- For log pipelines:
  - A log pipeline can have a maximum of three destinations.

## Further Reading{% #further-reading %}

- [Update an existing pipeline](https://docs.datadoghq.com/observability_pipelines/configuration/update_existing_pipelines.md)
- [Advanced Worker configurations for Observability Pipelines](https://docs.datadoghq.com/observability_pipelines/configuration/install_the_worker/advanced_worker_configurations.md)
- [Run multiple pipelines on a host](https://docs.datadoghq.com/observability_pipelines/configuration/install_the_worker/run_multiple_pipelines_on_a_host.md)
- [Troubleshooting Observability Pipelines](https://docs.datadoghq.com/observability_pipelines/monitoring_and_troubleshooting/troubleshooting.md)
