Create an Agent Integration


This guide provides instructions for creating a Datadog Agent integration in the integrations-extras repository. For more information about why you would want to create an Agent-based integration, see Creating your own solution.


The required Datadog Agent integration development tools include Docker, a supported version of Python, and the Agent Integration Developer Tool (ddev).

Set up your integrations-extras repo

Follow these instructions to set up your repo for integration development:

  1. Create the dd directory:

    The Datadog Development Toolkit expects you to work in the $HOME/dd/ directory. This is not mandatory, but working in a different directory requires additional configuration steps.

    To create the dd directory and clone the integrations-extras repo:

    mkdir $HOME/dd && cd $HOME/dd
  2. Fork the integrations-extras repo.

  3. Clone your fork into the dd directory:

git clone https://github.com/<YOUR USERNAME>/integrations-extras.git
  4. Create a feature branch to work in:

    git switch -c <YOUR INTEGRATION NAME> origin/master

Configure the developer tool

Assuming you’ve installed the Agent Integration Developer Tool, configure the tool for the integrations-extras repo:

  1. Optionally, if your integrations-extras repo is somewhere other than $HOME/dd/, adjust the ddev configuration file:

    ddev config set extras "/path/to/integrations-extras"
  2. Set integrations-extras as the default working repository:

    ddev config set repo extras

Create your integration

Once you’ve downloaded Docker, installed an appropriate version of Python, and prepared your development environment, you can get started with creating an Agent-based integration. The instructions below use an example integration called Awesome. Follow along using the code from Awesome, or replace Awesome with your own code.

Create scaffolding for your integration

The ddev create command runs an interactive tool that creates the basic file and path structure (or “scaffolding”) necessary for a new Agent-based integration.

  1. Before you create your first integration directory, try a dry-run using the -n/--dry-run flag, which doesn’t write anything to the disk:

    ddev create -n Awesome

    This command displays the path where the files would have been written, as well as the structure itself. Make sure the path in the first line of output matches your integrations-extras repository location.

  2. Run the command without the -n flag. The tool asks you for an email and name and then creates the files you need to get started with an integration.

    ddev create Awesome

Write an Agent Check

At the core of each Agent-based integration is an Agent Check that periodically collects information and sends it to Datadog. Checks inherit their logic from the AgentCheck base class and have the following requirements:

  • Integrations running on the Datadog Agent v7 and later must be compatible with Python 3; however, Agents v5 and v6 still use Python 2.7.
  • Checks must derive from AgentCheck.
  • Checks must provide a method with this signature: check(self, instance).
  • Checks are organized in regular Python packages under the datadog_checks namespace. For example, the code for Awesome lives in the awesome/datadog_checks/awesome/ directory.
  • The name of the package must be the same as the check name.
  • There are no restrictions on the name of the Python modules within that package, nor on the name of the class implementing the check.

Implement check logic

For Awesome, the Agent Check is composed of a Service Check named awesome.search that searches for a string on a web page. It results in OK if the string is present, WARNING if the page is accessible but the string was not found, and CRITICAL if the page is inaccessible. To learn how to submit metrics with your Agent Check, see Custom Agent Check.

The code contained within awesome/datadog_checks/awesome/ looks something like this:

import requests

from datadog_checks.base import AgentCheck, ConfigurationError

class AwesomeCheck(AgentCheck):
    """AwesomeCheck derives from AgentCheck, and provides the required check method."""

    def check(self, instance):
        url = instance.get('url')
        search_string = instance.get('search_string')

        # It's a very good idea to do some basic sanity checking.
        # Try to be as specific as possible with the exceptions.
        if not url or not search_string:
            raise ConfigurationError('Configuration error, please fix awesome.yaml')

        try:
            response = requests.get(url)
            response.raise_for_status()
        # Something went horribly wrong
        except Exception as e:
            # Ideally we'd use a more specific message...
            self.service_check('awesome.search', self.CRITICAL, message=str(e))
        # Page is accessible
        else:
            # search_string is present
            if search_string in response.text:
                self.service_check('awesome.search', self.OK)
            # search_string was not found
            else:
                self.service_check('awesome.search', self.WARNING)

To learn more about the base Python class, see Anatomy of a Python Check.
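The status decision in the check can be exercised outside the Agent. Here is a minimal sketch of the same logic; the classify helper is hypothetical and not part of datadog_checks:

```python
# Hypothetical helper mirroring the Awesome check's status decision:
# CRITICAL when the page is unreachable, WARNING when reachable but the
# string is absent, OK when the string is found.
def classify(page_text, search_string):
    """page_text is None when the HTTP request raised an exception."""
    if page_text is None:
        return 'CRITICAL'
    return 'OK' if search_string in page_text else 'WARNING'

print(classify(None, 'nginx'))                  # page unreachable
print(classify('Welcome to nginx!', 'nginx'))   # string found
print(classify('Welcome to nginx!', 'Apache'))  # string absent
```

Keeping the decision logic this small is what makes the unit tests in the next section straightforward to write.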

Write validation tests

There are two basic types of tests:

  • Unit tests for specific elements of functionality.
  • Integration tests that execute the check method and verify proper metrics collection.

pytest and hatch are used to run the tests. Tests are required if you want your integration to be included in the integrations-extras repository.

Write a unit test

The first part of the check method for Awesome retrieves and verifies two elements from the configuration file. This is a good candidate for a unit test. Open the file at awesome/tests/ and replace the contents with the following:

import pytest

# Don't forget to import your integration
from datadog_checks.awesome import AwesomeCheck
from datadog_checks.base import ConfigurationError


@pytest.mark.unit
def test_config():
    instance = {}
    c = AwesomeCheck('awesome', {}, [instance])

    # empty instance
    with pytest.raises(ConfigurationError):
        c.check(instance)

    # only the url
    with pytest.raises(ConfigurationError):
        c.check({'url': 'http://foobar'})

    # only the search string
    with pytest.raises(ConfigurationError):
        c.check({'search_string': 'foo'})

    # this should not fail
    c.check({'url': 'http://foobar', 'search_string': 'foo'})

pytest has the concept of markers that can be used to group tests into categories. Notice that test_config is marked as a unit test.

The scaffolding is set up to run all the tests located in awesome/tests.

To run the tests, run:

ddev test awesome

Write an integration test

The unit test above doesn’t check the collection logic. To test the logic, you need to create an environment for an integration test and write an integration test.

Create an environment for the integration test

The toolkit uses Docker to spin up an Nginx container and lets the check retrieve the welcome page.

To create an environment for the integration test, create a docker-compose file at awesome/tests/docker-compose.yml with the following contents:


version: "3"

services:
  nginx:
    image: nginx:stable-alpine
    ports:
      - "8000:80"

Next, open the file at awesome/tests/ and replace the contents with the following:

import os

import pytest

from datadog_checks.dev import docker_run, get_docker_hostname, get_here

URL = 'http://{}:8000'.format(get_docker_hostname())
SEARCH_STRING = 'Thank you for using nginx.'
INSTANCE = {'url': URL, 'search_string': SEARCH_STRING}

@pytest.fixture(scope='session')
def dd_environment():
    compose_file = os.path.join(get_here(), 'docker-compose.yml')

    # This does 3 things:
    # 1. Spins up the services defined in the compose file
    # 2. Waits for the url to be available before running the tests
    # 3. Tears down the services when the tests are finished
    with docker_run(compose_file, endpoints=[URL]):
        yield INSTANCE

@pytest.fixture
def instance():
    return INSTANCE.copy()
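The docker_run helper follows the standard context-manager lifecycle: set up, yield to the tests, then tear down. A rough sketch of that pattern with contextlib (a stand-in to illustrate the lifecycle, not the real datadog_checks.dev implementation):

```python
from contextlib import contextmanager

@contextmanager
def fake_docker_run(compose_file, endpoints):
    # Stand-in for docker_run: "start" the services and wait on endpoints...
    env = {'compose_file': compose_file, 'endpoints': endpoints, 'up': True}
    try:
        yield env          # ...hand control to the tests...
    finally:
        env['up'] = False  # ...and always tear down, even if a test fails.

with fake_docker_run('docker-compose.yml', ['http://localhost:8000']) as env:
    assert env['up']
# After the block exits, teardown has run:
print(env['up'])  # False
```

Because dd_environment yields inside the context manager, the Nginx container stays up for the whole test session and is removed afterwards.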

Add an integration test

After you’ve set up an environment for the integration test, add an integration test to the awesome/tests/ file:

@pytest.mark.integration
@pytest.mark.usefixtures('dd_environment')
def test_service_check(aggregator, instance):
    c = AwesomeCheck('awesome', {}, [instance])

    # the check should send OK
    c.check(instance)
    aggregator.assert_service_check('awesome.search', AwesomeCheck.OK)

    # the check should send WARNING
    instance['search_string'] = 'Apache'
    c.check(instance)
    aggregator.assert_service_check('awesome.search', AwesomeCheck.WARNING)

To speed up development, use the -m/--marker option to run integration tests only:

ddev test -m integration awesome

Your integration is almost complete. Next, add the necessary check assets.

Create the check assets

The set of assets created by the ddev scaffolding must be populated in order for a check to be considered in integrations-extras:

  • README.md: This contains the documentation for your Agent Check, how to set it up, which data it collects, and support information.
  • spec.yaml: This is used to generate the conf.yaml.example using the ddev tooling (see the Configuration template tab below). For more information, see Configuration specification.
  • conf.yaml.example: This contains default (or example) configuration options for your Agent Check. Do not edit this file by hand! It is generated from the contents of spec.yaml. For more information, see the Configuration file reference.
  • manifest.json: This contains the metadata for your Agent Check such as the title and categories. For more information, see the Manifest file reference.
  • metadata.csv: This contains the list of all metrics collected by your Agent Check. For more information, see the Metrics metadata file reference.
  • service_checks.json: This contains the list of all Service Checks collected by your Agent Check. For more information, see the Service check file reference.

For this example, the awesome/assets/configuration/spec.yaml used to generate awesome/datadog_checks/awesome/data/conf.yaml.example appears in the following format:

name: Awesome
files:
- name: awesome.yaml
  options:
  - template: init_config
    options:
    - template: init_config/default
  - template: instances
    options:
    - name: url
      required: true
      description: The URL to check.
      value:
        type: string
    - name: search_string
      required: true
      description: The string to search for.
      value:
        type: string
        example: Example Domain
    - name: flag_follow_redirects
      # required: false is implicit; comment it to see what happens!
      required: false
      description: Follow 301 redirects.
      value:
        type: boolean
        example: false
    # Try transposing these templates to see what happens!
    #- template: instances/http
    - template: instances/default

To generate conf.yaml.example using ddev, run:

ddev validate config --sync awesome
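For orientation, the generated conf.yaml.example for the spec above looks roughly like the following sketch; the exact comment blocks and defaults come from the ddev templates, so treat this as illustrative:

```yaml
init_config:

instances:

    ## @param url - string - required
    ## The URL to check.
    #
  - url: <URL>

    ## @param search_string - string - required
    ## The string to search for.
    #
    search_string: Example Domain

    ## @param flag_follow_redirects - boolean - optional - default: false
    ## Follow 301 redirects.
    #
    # flag_follow_redirects: false
```

Required options appear uncommented with a placeholder, while optional ones are commented out with their default shown.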

For this example, the awesome/manifest.json for the Awesome Service Check appears in the following format:

{
  "manifest_version": "2.0.0",
  "app_uuid": "79eb6e54-2110-4d50-86c3-f7037d1a9daa", // Do not use this example UUID. UUIDs must be unique and valid.
  "app_id": "awesome",
  "classifier_tags": [
    "Supported OS::Linux",
    "Supported OS::Mac OS",
    "Supported OS::Windows"
  ],
  "display_on_public_website": false,
  "tile": {
    "overview": "",
    "configuration": "",
    "support": "",
    "changelog": "",
    "description": "",
    "title": "Awesome",
    "media": []
  },
  "author": {
    "support_email": ""
  },
  "oauth": {},
  "assets": {
    "integration": {
      "source_type_name": "Awesome",
      "configuration": {
        "spec": "assets/configuration/spec.yaml"
      },
      "events": {
        "creates_events": false
      },
      "metrics": {
        "prefix": "awesome.",
        "check": "",
        "metadata_path": "metadata.csv"
      },
      "service_checks": {
        "metadata_path": "assets/service_checks.json"
      }
    }
  }
}

For this example, the Awesome integration doesn’t provide any metrics, so in this case, the generated awesome/metadata.csv contains only a row with the column names.
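At the time of writing, that header row looks like the following; check the file generated by the scaffolding for the authoritative column list:

```csv
metric_name,metric_type,interval,unit_name,per_unit_name,description,orientation,integration,short_name,curated_metric
```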

For this example, the Awesome integration contains a Service Check, so you need to add it to the awesome/assets/service_checks.json file:

[
  {
    "agent_version": "6.0.0",
    "integration": "awesome",
    "check": "awesome.search",
    "statuses": ["ok", "warning", "critical"],
    "groups": [],
    "name": "Awesome search!",
    "description": "Returns `CRITICAL` if the check can't access the page, `WARNING` if the search string was not found, or `OK` otherwise."
  }
]

Build the wheel

The pyproject.toml file provides the metadata that is used to package and build the wheel. The wheel contains the files necessary for the functioning of the integration itself, which includes the Check, configuration example file, and artifacts generated during the build of the wheel.

All additional elements, including the metadata files, are not meant to be contained within the wheel, and are used elsewhere by the Datadog platform and ecosystem. To learn more about Python packaging, see Packaging Python Projects.
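As a rough sketch of the shape of that file (the scaffolding generates the authoritative version, so the field values here are illustrative, not exact):

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "datadog-awesome"
description = "The Awesome check"
version = "1.0.0"
dependencies = ["datadog-checks-base"]

# Only ship the check package itself in the wheel.
[tool.hatch.build.targets.wheel]
include = ["/datadog_checks"]
```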

Once your pyproject.toml is ready, create a wheel:

  • (Recommended) With the ddev tooling: ddev release build <INTEGRATION_NAME>.
  • Without the ddev tooling: cd <INTEGRATION_DIR> && pip wheel . --no-deps --wheel-dir dist.

Install the wheel

The wheel is installed using the Agent integration command, available in Agent v6.10.0 and up. Depending on your environment, you may need to execute this command as a specific user or with specific privileges:

Linux (as dd-agent):

sudo -u dd-agent datadog-agent integration install -w /path/to/wheel.whl

OSX (as admin):

sudo datadog-agent integration install -w /path/to/wheel.whl

Windows PowerShell (Ensure that your shell session has administrator privileges):

Agent v6.11 or earlier
& "C:\Program Files\Datadog\Datadog Agent\embedded\agent.exe" integration install -w /path/to/wheel.whl
Agent v6.12 or later
& "C:\Program Files\Datadog\Datadog Agent\bin\agent.exe" integration install -w /path/to/wheel.whl

Review the checklist for publishing your integration

After you’ve created your Agent-based integration, refer to this list to make sure your integration contains all the required files and validations:

  • A README.md file with the correct format and contents.
  • A battery of tests verifying metrics collection.
  • A metadata.csv file listing all of the collected metrics.
  • A complete manifest.json file.
  • If the integration collects Service Checks, the service_checks.json must be complete as well.

Before you open a pull request, run the following command to catch any problems with your integration:

ddev validate all <INTEGRATION_NAME>

After you’ve created your pull request, automatic checks run to verify that it is in good shape and contains all the required content.

Further Reading

Additional helpful documentation, links, and articles: