Datadog-Apache Kafka Consumer Integration

Overview

This Agent check only collects metrics for message offsets. If you want to collect metrics about the Kafka brokers themselves, see the kafka check.

This check fetches the highwater offsets from the Kafka brokers and the consumer offsets for old-style consumers that store their offsets in ZooKeeper, then reports the consumer lag, i.e. the difference between those two offsets (for example, a broker highwater offset of 1000 and a consumer offset of 950 yields a lag of 50 messages).

This check does NOT support Kafka versions newer than 0.8: it cannot collect consumer offsets for new-style consumer groups, which store their offsets in Kafka rather than in ZooKeeper. If you run such a version of Kafka, track this issue on GitHub.

Setup

Installation

The Agent’s Kafka consumer check is packaged with the Agent, so simply install the Agent on your Kafka nodes. If you need the newest version of the check, install the dd-check-kafka-consumer package.

Configuration

Create a kafka_consumer.yaml file in the Agent's conf.d directory, using this sample conf file as an example. Then restart the Datadog Agent to start sending metrics to Datadog.
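
As a rough sketch (the connection strings, consumer group, topic, and partition numbers below are hypothetical, and the exact option names should be checked against the sample conf file), a minimal configuration looks something like:

  init_config:

  instances:
    - kafka_connect_str: localhost:9092
      zk_connect_str: localhost:2181
      consumer_groups:
        my_consumer_group:
          my_topic: [0, 1, 4, 12]

Each listed consumer group maps topics to the partitions whose offsets you want monitored.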

Validation

Run the Agent's info subcommand and look for kafka_consumer under the Checks section.
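
On many Linux hosts this is typically one of the following, though the exact invocation depends on your platform and Agent version:

  sudo /etc/init.d/datadog-agent info

or

  sudo service datadog-agent info

Expect output along these lines: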

  Checks
  ======
    [...]

    kafka_consumer
    --------------
      - instance #0 [OK]
      - Collected 26 metrics, 0 events & 1 service check

    [...]

Compatibility

The kafka_consumer check is compatible with all major platforms.

Data Collected

Metrics

kafka.broker_offset (gauge)
  Current message offset on broker. Shown as: offset

kafka.consumer_lag (gauge)
  Lag in messages between consumer and broker. Shown as: offset

kafka.consumer_offset (gauge)
  Current message offset on consumer. Shown as: offset

Events

consumer_lag:

The Datadog Agent emits an event when the value of the consumer_lag metric goes below 0, tagging it with topic, partition and consumer_group.

Service Checks

The kafka_consumer check does not include any service checks at this time.

Troubleshooting

Specifying a nonexistent partition in your kafka_consumer.yaml file

If you get this error in your info.log:

instance - #0 [Error]: ''

Specify the partitions that actually exist for your topic in your kafka_consumer.yaml file:

  my_topic: [0, 1, 4, 12]
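
In the configuration file, this line lives under its consumer group in the consumer_groups block of an instance; a sketch using the same hypothetical names as in the configuration example above:

  consumer_groups:
    my_consumer_group:
      my_topic: [0, 1, 4, 12]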

Further Reading