To consider an Agent-based integration complete, and thus ready to be included in the core repository and bundled with the Agent package, a number of prerequisites must be met:

- A README.md file with the correct format and contents
- A metadata.csv file listing all of the collected metrics
- A manifest.json file
- A service_checks.json file, which must be complete as well

These requirements are used during the code review process as a checklist. This documentation covers the requirements and implementation details for a brand new integration.
In general, creating and activating Python virtual environments to isolate the development environment is good practice; however, it is not mandatory. For more information, see the Python Environment documentation.
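For example, assuming Python 3 is available and using an arbitrary directory for the environment (the path below is only an example), you could create and activate one like this:

python3 -m venv ~/dd/awesome-venv   # any path works; this one is illustrative
source ~/dd/awesome-venv/bin/activate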
Clone the integrations-extras repository. By default, the Datadog developer tooling (ddev) expects you to be working in the $HOME/dd/ directory; this is optional and can be adjusted later via configuration.
mkdir $HOME/dd && cd $HOME/dd # optional
git clone https://github.com/DataDog/integrations-extras.git
The Developer Toolkit is comprehensive and includes a lot of functionality. Here’s what you need to get started:
pip3 install "datadog-checks-dev[cli]"
If you chose to clone this repository to somewhere other than $HOME/dd/, you'll need to adjust the configuration file:
ddev config set extras "/path/to/integrations-extras"
If you intend to work primarily on integrations-extras, set it as the default working repository:
ddev config set repo extras
Note: If you do not do this step, you'll need to use -e for every invocation to ensure the context is integrations-extras:
ddev -e COMMAND [OPTIONS]
One of the developer toolkit features is the create
command, which creates the basic file and path structure (or “scaffolding”) necessary for a new integration.
Let’s try a dry-run using the -n/--dry-run
flag, which won’t write anything to disk.
ddev create -n Awesome
This displays the path where the files would have been written, as well as the structure itself. For now, just make sure that the path in the first line of output matches your Extras repository.
The interactive mode is a wizard for creating new integrations. By answering a handful of questions, the scaffolding will be set up and lightly pre-configured for you.
ddev create Awesome
After answering the questions, the output matches that of the dry-run above, except in this case the scaffolding for your new integration actually exists!
A Check is a Python class with the following requirements:

- It must derive from AgentCheck
- It must provide a method with this signature: check(self, instance)

Checks are organized in regular Python packages under the datadog_checks namespace, so your code should live under awesome/datadog_checks/awesome. The only requirement is that the name of the package has to be the same as the check name. There are no particular restrictions on the name of the Python modules within that package, nor on the name of the class implementing the check.
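For this example, the relevant part of the generated layout looks roughly like the following (other generated files are omitted, and the exact scaffolding can vary slightly between ddev versions):

awesome/
├── datadog_checks/
│   └── awesome/
│       ├── __init__.py
│       └── check.py
└── tests/
    ├── conftest.py
    └── test_awesome.py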
Let’s say you want to create an Agent Check composed only of a Service Check named awesome.search
that searches for a string on a web page. It will result in OK
if the string is present, WARNING
if the page is accessible but the string was not found, and CRITICAL
if the page is inaccessible. See the Metric Submission: Custom Agent Check if you want to learn how to submit metrics with your Agent Check.
The code contained within awesome/datadog_checks/awesome/check.py
looks something like this:
import requests

from datadog_checks.base import AgentCheck, ConfigurationError


class AwesomeCheck(AgentCheck):
    """AwesomeCheck derives from AgentCheck, and provides the required check method."""

    def check(self, instance):
        url = instance.get('url')
        search_string = instance.get('search_string')

        # It's a very good idea to do some basic sanity checking.
        # Try to be as specific as possible with the exceptions.
        if not url or not search_string:
            raise ConfigurationError('Configuration error, please fix awesome.yaml')

        try:
            response = requests.get(url)
            response.raise_for_status()
        # Something went horribly wrong
        except Exception as e:
            # Ideally we'd use a more specific message...
            self.service_check('awesome.search', self.CRITICAL, message=str(e))
        # Page is accessible
        else:
            # search_string is present
            if search_string in response.text:
                self.service_check('awesome.search', self.OK)
            # search_string was not found
            else:
                self.service_check('awesome.search', self.WARNING)
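Although this example only submits a Service Check, the same check method can also submit metrics through methods inherited from AgentCheck, such as self.gauge. As a hedged sketch (the metric name below is invented purely for illustration), you could add a line like this in the else branch, alongside the existing service_check calls:

# Hypothetical metric, for illustration only: how long the page took to respond.
self.gauge('awesome.page.response_time', response.elapsed.total_seconds(), tags=['url:{}'.format(url)])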
To learn more about the base Python class, see the Python API documentation.
There are two basic types of tests:

- Unit tests for specific parts of your code
- Integration tests that execute the check method and verify proper metrics collection

Tests are required if you want your integration to be included in integrations-extras. Note that pytest and tox are used to run the tests.
For more information, see the Datadog Checks Dev documentation.
The first part of the check
method retrieves and verifies two elements from the configuration file. This is a good candidate for a unit test. Open the file at awesome/tests/test_awesome.py
and replace the contents with something like this:
import pytest

# Don't forget to import your integration!
from datadog_checks.awesome import AwesomeCheck
from datadog_checks.base import ConfigurationError


@pytest.mark.unit
def test_config():
    instance = {}
    c = AwesomeCheck('awesome', {}, [instance])

    # empty instance
    with pytest.raises(ConfigurationError):
        c.check(instance)

    # only the url
    with pytest.raises(ConfigurationError):
        c.check({'url': 'http://foobar'})

    # only the search string
    with pytest.raises(ConfigurationError):
        c.check({'search_string': 'foo'})

    # this should not fail
    c.check({'url': 'http://foobar', 'search_string': 'foo'})
pytest
has the concept of markers that can be used to group tests into categories. Notice that test_config
is marked as a unit
test.
The scaffolding has already been set up to run all tests located in awesome/tests
. Run the tests:
ddev test awesome
This test doesn’t check the collection logic though, so let’s add an integration test. docker
is used to spin up an Nginx container and let the check retrieve the welcome page. Create a compose file at awesome/tests/docker-compose.yml
with the following contents:
version: "3"
services:
nginx:
image: nginx:stable-alpine
ports:
- "8000:80"
Now, open the file at awesome/tests/conftest.py
and replace the contents with something like this:
import os

import pytest

from datadog_checks.dev import docker_run, get_docker_hostname, get_here

URL = 'http://{}:8000'.format(get_docker_hostname())
SEARCH_STRING = 'Thank you for using nginx.'
INSTANCE = {'url': URL, 'search_string': SEARCH_STRING}


@pytest.fixture(scope='session')
def dd_environment():
    compose_file = os.path.join(get_here(), 'docker-compose.yml')

    # This does 3 things:
    #
    # 1. Spins up the services defined in the compose file
    # 2. Waits for the url to be available before running the tests
    # 3. Tears down the services when the tests are finished
    with docker_run(compose_file, endpoints=[URL]):
        yield INSTANCE


@pytest.fixture
def instance():
    return INSTANCE.copy()
Finally, add an integration test to the awesome/tests/test_awesome.py
file:
@pytest.mark.integration
@pytest.mark.usefixtures('dd_environment')
def test_service_check(aggregator, instance):
    c = AwesomeCheck('awesome', {}, [instance])

    # the check should send OK
    c.check(instance)
    aggregator.assert_service_check('awesome.search', AwesomeCheck.OK)

    # the check should send WARNING
    instance['search_string'] = 'Apache'
    c.check(instance)
    aggregator.assert_service_check('awesome.search', AwesomeCheck.WARNING)
Run only integration tests for faster development using the -m/--marker
option:
ddev test -m integration awesome
The check is almost done. Let’s add the final touches by adding the integration configurations.
The set of assets created by the ddev scaffolding must be populated in order for a check to be considered for inclusion:

- README.md: This contains the documentation for your Check, how to set it up, which data it collects, and so on.
- spec.yaml: This is used to generate conf.yaml.example using the ddev tooling (see the "Configuration template" tab below). See the configuration specification documentation to learn more.
- conf.yaml.example: This contains default (or example) configuration options for your Agent Check. Do not edit this file by hand! It is generated from the contents of spec.yaml. See the configuration file reference documentation to learn its logic.
- manifest.json: This contains the metadata for your Agent Check, such as the title, categories, and so on. See the manifest reference documentation to learn more.
- metadata.csv: This contains the list of all metrics collected by your Agent Check. See the metrics metadata reference documentation to learn more.
- service_checks.json: This contains the list of all Service Checks collected by your Agent Check. See the Service Check reference documentation to learn more.

For this example, those files would have the following form:
The awesome/assets/configuration/spec.yaml used to generate awesome/datadog_checks/awesome/data/conf.yaml.example:
name: Awesome
files:
- name: awesome.yaml
  options:
  - template: init_config
    options:
    - template: init_config/default
  - template: instances
    options:
    - name: url
      required: true
      description: The URL to check.
      value:
        type: string
        example: http://example.org
    - name: search_string
      required: true
      description: The string to search for.
      value:
        type: string
        example: Example Domain
    - name: flag_follow_redirects
      # required: false is implicit; comment it to see what happens!
      required: false
      description: Follow 301 redirects.
      value:
        type: boolean
        example: false
    # Try transposing these templates to see what happens!
    #- template: instances/http
    - template: instances/default
Generate conf.yaml.example using ddev:
ddev validate config --sync awesome
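An end user then copies the generated conf.yaml.example into place and fills in the options defined by the spec. Purely as an illustrative sketch (the values are examples, and the generated template itself contains additional comments produced by ddev), a filled-in awesome.yaml might look like this:

init_config:

instances:
  - url: http://example.org
    search_string: "Example Domain"
    flag_follow_redirects: false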
The awesome/manifest.json for the Awesome Service Check. Note that the guid must be unique (and valid), so do not use the one from this example (the tooling will generate one for you in any case):
{
  "display_name": "awesome",
  "maintainer": "email@example.org",
  "manifest_version": "1.0.0",
  "name": "awesome",
  "metric_prefix": "awesome.",
  "metric_to_check": "",
  "creates_events": false,
  "short_description": "",
  "guid": "x16b8750-df1e-46c0-839a-2056461b604x",
  "support": "contrib",
  "supported_os": ["linux", "mac_os", "windows"],
  "public_title": "Datadog-awesome Integration",
  "categories": ["web"],
  "type": "check",
  "is_public": false,
  "integration_id": "awesome",
  "assets": {
    "dashboards": {
      "Awesome Overview": "assets/dashboards/overview.json",
      "Awesome Investigation Dashboard": "assets/dashboards/investigation.json"
    },
    "monitors": {},
    "service_checks": "assets/service_checks.json"
  }
}
The example integration doesn’t send any metrics, so in this case the generated awesome/metadata.csv
contains only the row containing CSV column names.
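For reference, and as an illustration only (the generated file is authoritative, and the exact column set may vary by tooling version), that header row conventionally looks like this:

metric_name,metric_type,interval,unit_name,per_unit_name,description,orientation,integration,short_name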
The example integration contains a service check, so you need to add it to the awesome/assets/service_checks.json
file:
[
  {
    "agent_version": "6.0.0",
    "integration": "awesome",
    "check": "awesome.search",
    "statuses": ["ok", "warning", "critical"],
    "groups": [],
    "name": "Awesome search!",
    "description": "Returns `CRITICAL` if the check can't access the page, `WARNING` if the search string was not found, or `OK` otherwise."
  }
]
setup.py
provides the setuptools setup script that helps us package and build the wheel. To learn more about Python packaging, take a look at the official Python documentation.
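The ddev scaffolding already generates this file for you, so you normally do not write it from scratch. Purely as a trimmed sketch of what such a setuptools script can contain (the names and version below are illustrative, not the exact generated contents):

# Illustrative sketch only; the scaffolding generates the real setup.py.
from setuptools import setup

setup(
    name='datadog-awesome',                    # assumed distribution name, for illustration
    version='0.0.1',
    description='The Awesome check',
    packages=['datadog_checks.awesome'],       # the package created under the datadog_checks namespace
    install_requires=['datadog-checks-base'],  # the base library the check class derives from
    include_package_data=True,                 # ship data files such as conf.yaml.example
)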
Once your setup.py
is ready, create a wheel:
- With the ddev tooling (recommended): ddev release build <INTEGRATION_NAME>
- Without the ddev tooling: cd <INTEGRATION_DIR> && python setup.py bdist_wheel
The wheel contains only the files necessary for the functioning of the integration itself. This includes the Check itself, the configuration example file, and some artifacts generated during the build of the wheel. All of the other elements, including the metadata files, are not meant to be contained within the wheel. These latter elements are used elsewhere by the greater Datadog platform and ecosystem.
The wheel is installed via the Agent integration
command, available in Agent v6.10.0 and up. Depending on your environment, you may need to execute this command as a specific user or with particular privileges:
Linux (as dd-agent):
sudo -u dd-agent datadog-agent integration install -w /path/to/wheel.whl
OSX (as admin):
sudo datadog-agent integration install -w /path/to/wheel.whl
Windows (Ensure that your shell session has administrator privileges):
For Agent versions <= 6.11:
"C:\Program Files\Datadog\Datadog Agent\embedded\agent.exe" integration install -w /path/to/wheel.whl
For Agent versions >= 6.12:
"C:\Program Files\Datadog\Datadog Agent\bin\agent.exe" integration install -w /path/to/wheel.whl