---
title: Kafka Source
description: Use the Observability Pipelines Kafka source to receive logs from your Kafka topics.
---

# Kafka Source

{% callout %}
# Important note for users on the following Datadog site: app.ddog-gov.com

{% alert level="danger" %}
This product is not supported for your selected [Datadog site](https://docs.datadoghq.com/getting_started/site).
{% /alert %}

{% /callout %}
Available for: {% icon name="icon-logs" /%} Logs

## Overview{% #overview %}

Use Observability Pipelines' Kafka source to receive logs from your Kafka topics. The Kafka source uses [librdkafka](https://github.com/confluentinc/librdkafka/tree/master).

You can also [send Azure Event Hub logs to Observability Pipelines using the Kafka source](https://docs.datadoghq.com/observability_pipelines/sources/azure_event_hubs/).

## Prerequisites{% #prerequisites %}

To use Observability Pipelines' Kafka source, you need the following information available:

- The hosts and ports of the Kafka bootstrap servers, which clients should use to connect to the Kafka cluster and discover all the other hosts in the cluster.
- The appropriate TLS certificates and the password you used to create your private key, if your forwarders are globally configured to enable SSL.
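Bootstrap servers are entered as comma-separated `host:port` pairs (see the format notes in [Set secrets](#set-secrets) below). As a minimal sketch, the following Python helper (hypothetical, not part of Observability Pipelines) checks that a bootstrap server string matches that shape before you store it in your secrets manager:

```python
import re

def validate_bootstrap_servers(servers: str) -> list[str]:
    """Split a comma-separated bootstrap server string and verify each
    entry is in host:port form. Illustrative helper only."""
    entries = [s.strip() for s in servers.split(",") if s.strip()]
    pattern = re.compile(r"^[\w.\-]+:\d{1,5}$")
    for entry in entries:
        if not pattern.match(entry):
            raise ValueError(f"invalid bootstrap server entry: {entry!r}")
    return entries

# Example: two brokers, comma-separated
print(validate_bootstrap_servers("10.14.22.123:9092, kafka-2.internal:9092"))
```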

## Setup{% #setup %}

Set up this source when you [set up a pipeline](https://docs.datadoghq.com/observability_pipelines/configuration/set_up_pipelines/). You can set up a pipeline in the [UI](https://app.datadoghq.com/observability-pipelines), using the [API](https://docs.datadoghq.com/api/latest/observability-pipelines/), or with [Terraform](https://registry.terraform.io/providers/datadog/datadog/latest/docs/resources/observability_pipeline). The instructions in this section are for setting up the source in the UI.

{% alert level="danger" %}
Enter only the identifiers for the Kafka servers, username, password, and, if applicable, the TLS key pass. Do not enter the actual values.
{% /alert %}

1. Enter the identifier for your Kafka servers. If you leave it blank, the default is used.
1. Enter the identifier for your Kafka username. If you leave it blank, the default is used.
1. Enter the identifier for your Kafka password. If you leave it blank, the default is used.
1. Enter the group ID.
1. Enter the topic name. If there is more than one, click **Add Field** to add additional topics.

### Optional settings{% #optional-settings %}

#### Enable SASL Authentication{% #enable-sasl-authentication %}

1. Toggle the switch to enable **SASL Authentication**.
1. Select the mechanism (**PLAIN**, **SCRAM-SHA-256**, or **SCRAM-SHA-512**) in the dropdown menu.

#### Enable TLS{% #enable-tls %}

Toggle the switch to **Enable TLS**. If you enable TLS, the following certificate and key files are required.

**Note**: All file paths are made relative to the configuration data directory, which is `/var/lib/observability-pipelines-worker/config/` by default. See [Advanced Worker Configurations](https://docs.datadoghq.com/observability_pipelines/configuration/install_the_worker/advanced_worker_configurations/) for more information. The files must be owned by the `observability-pipelines-worker` user and `observability-pipelines-worker` group, or at least readable by that user or group.

- Enter the identifier for your Kafka key pass. If you leave it blank, the default is used.
- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) root file, in DER or PEM (X.509) format.
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) root file, in DER or PEM (X.509) format.
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
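Because the paths above are relative to the configuration data directory, a given entry resolves against that directory rather than the filesystem root. The following sketch shows the resolution, assuming the default data directory; the `certs/server.pem` path is a hypothetical example:

```python
import os

# Default configuration data directory (from the note above)
CONFIG_DIR = "/var/lib/observability-pipelines-worker/config/"

def resolve_config_path(relative_path: str) -> str:
    """Resolve a certificate or key path relative to the config data directory."""
    return os.path.join(CONFIG_DIR, relative_path)

print(resolve_config_path("certs/server.pem"))
```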

#### Add additional librdkafka options{% #add-additional-librdkafka-options %}

1. Click **Advanced** and then **Add Option**.
1. Select an option in the dropdown menu.
1. Enter a value for that option.
1. Check your values against the [librdkafka documentation](https://docs.confluent.io/platform/current/clients/librdkafka/html/md_CONFIGURATION.html) to make sure they have the correct type and are within the set range.
1. Click **Add Option** to add another librdkafka option.

## Set secrets{% #set-secrets %}

These are the defaults used for secret identifiers and environment variables.

**Note**: If you enter secret identifiers and then choose to use environment variables, the environment variable name is the identifier you entered, prefixed with `DD_OP_`. For example, if you entered `PASSWORD_1` for a password identifier, the environment variable for that password is `DD_OP_PASSWORD_1`.
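The identifier-to-environment-variable mapping can be sketched as a one-line helper (illustrative only, not part of the Worker):

```python
def secret_env_var(identifier: str) -> str:
    """Derive the environment variable name for a secret identifier:
    the identifier prefixed with DD_OP_."""
    return f"DD_OP_{identifier}"

print(secret_env_var("PASSWORD_1"))  # -> DD_OP_PASSWORD_1
```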

{% tab title="Secrets Management" %}

- Kafka bootstrap servers identifier:
  - References the bootstrap server that the client uses to connect to the Kafka cluster and discover all the other hosts in the cluster.
  - In your secrets manager, the host and port must be entered in the format of `host:port`, such as `10.14.22.123:9092`. If there is more than one server, use commas to separate them.
  - The default identifier is `SOURCE_KAFKA_BOOTSTRAP_SERVERS`.
- Kafka SASL username identifier:
  - The default identifier is `SOURCE_KAFKA_SASL_USERNAME`.
- Kafka SASL password identifier:
  - The default identifier is `SOURCE_KAFKA_SASL_PASSWORD`.
- Kafka TLS passphrase identifier (when TLS is enabled):
  - The default identifier is `SOURCE_KAFKA_KEY_PASS`.

{% /tab %}

{% tab title="Environment Variables" %}

- The host and port of the Kafka bootstrap servers.
  - The bootstrap server that the client uses to connect to the Kafka cluster and discover all the other hosts in the cluster. The host and port must be entered in the format of `host:port`, such as `10.14.22.123:9092`. If there is more than one server, use commas to separate them.
  - The default environment variable is `DD_OP_SOURCE_KAFKA_BOOTSTRAP_SERVERS`.
- SASL (when enabled):
  - Kafka SASL username
    - The default environment variable is `DD_OP_SOURCE_KAFKA_SASL_USERNAME`.
  - Kafka SASL password
    - The default environment variable is `DD_OP_SOURCE_KAFKA_SASL_PASSWORD`.
- Kafka TLS passphrase (when enabled):
  - The default environment variable is `DD_OP_SOURCE_KAFKA_KEY_PASS`.

{% /tab %}

## librdkafka options{% #librdkafka-options %}

These are the available librdkafka options:

- auto.offset.reset
- auto.commit.interval.ms
- client.id
- coordinator.query.interval.ms
- enable.auto.commit
- enable.auto.offset.store
- fetch.max.bytes
- fetch.message.max.bytes
- fetch.min.bytes
- fetch.wait.max.ms
- group.instance.id
- heartbeat.interval.ms
- queued.min.messages
- session.timeout.ms
- socket.timeout.ms

See the [librdkafka documentation](https://docs.confluent.io/platform/current/clients/librdkafka/html/md_CONFIGURATION.html) for more information and to ensure your values have the correct type and are within range.
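As a sketch of that type checking, the snippet below pairs a few of the options above with hypothetical example values (the values are illustrative assumptions, not recommended settings; consult the librdkafka documentation for actual defaults and ranges) and verifies each value has the expected Python type:

```python
# Hypothetical example values for a few librdkafka consumer options.
# enable.* options are booleans; *.ms and *.bytes options are integers;
# auto.offset.reset takes an enum-like string.
example_options = {
    "auto.offset.reset": "latest",    # enum string
    "auto.commit.interval.ms": 5000,  # integer milliseconds
    "enable.auto.commit": True,       # boolean
    "fetch.min.bytes": 1,             # integer bytes
    "session.timeout.ms": 45000,      # integer milliseconds
}

expected_types = {
    "auto.offset.reset": str,
    "auto.commit.interval.ms": int,
    "enable.auto.commit": bool,
    "fetch.min.bytes": int,
    "session.timeout.ms": int,
}

for name, value in example_options.items():
    assert isinstance(value, expected_types[name]), f"wrong type for {name}"
print("all option values have the expected type")
```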
