This page guides Technology Partners through the process of creating an official Datadog Agent integration.
Agent-based integrations are designed to collect telemetry from software or systems running on customer-managed infrastructure, where the Datadog Agent is installed or has network access. These integrations use the Datadog Agent to collect and submit data through custom agent checks developed by approved Technology Partners.
Agent checks can emit metrics, events, and logs into a customer’s Datadog account. Each agent-based integration is a Python package built on top of the Datadog Agent, which customers can install easily through the Agent. Traces, however, are collected outside of the agent check using one of Datadog’s tracing libraries. For more information, see the Application Instrumentation documentation.
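Create a working directory. The developer tool expects your work to be located in $HOME/dd/:
mkdir $HOME/dd && cd $HOME/dd
Clone the Datadog/integrations-extras repository:
git clone git@github.com:DataDog/integrations-extras.git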
Create and switch to a new branch for your integration:
cd integrations-extras
git switch -c <YOUR_INTEGRATION_NAME> origin/master
Set extras as the default working repository:
ddev config set repo extras
If your repository is stored outside $HOME/dd/, specify the path before setting it as the default:
ddev config set repos.extras "/path/to/integrations-extras"
ddev config set repo extras
Create a working directory. The developer tool expects your work to be located in $HOME/dd/:
mkdir $HOME/dd && cd $HOME/dd
Clone the Datadog/marketplace repository. If you don’t have access, request it from your Datadog contact.
git clone git@github.com:DataDog/marketplace.git
Create and switch to a new branch for your integration:
cd marketplace
git switch -c <YOUR_INTEGRATION_NAME> origin/master
Set marketplace as the default working repository:
ddev config set repo marketplace
If your repository is stored outside $HOME/dd/, specify the path before setting it as the default:
ddev config set repos.marketplace "/path/to/marketplace"
ddev config set repo marketplace
Generate your scaffolding
Use the ddev create command to generate the initial file and directory structure for your agent-based integration.
See the Configuration Method tab in the Developer Platform for the correct command for your integration.
Run a dry run (recommended)
Use the -n or --dry-run flag to preview the files that would be generated without writing anything to disk. Confirm that the output path matches the expected repository location.
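For example, a dry run for an integration named Awesome might look like this:
ddev create -n Awesome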
After verifying the directory location, run the same command without the -n flag to create the scaffolding. Follow the prompts to provide integration details.
Each agent-based integration centers around an agent check, a Python class that periodically collects telemetry and submits it to Datadog.
Agent checks inherit from the AgentCheck base class and must meet the following requirements:
Python compatibility:
Integrations for Datadog Agent v7+ must support Python 3. All new integrations must target v7+.
Integrations for Datadog Agent v5-v6 use Python 2.7.
Class inheritance: Each check must subclass AgentCheck.
Entry point: Each check must implement a check(self, instance) method.
Package structure: Checks are organized under the datadog_checks namespace. For example, an integration named <INTEGRATION_NAME> lives in: <integration_name>/datadog_checks/<integration_name>/.
Naming:
The package name must match the check name.
Python module and class names within the package can be freely chosen.
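Taken together, a minimal check that satisfies these requirements might look like the following sketch (the class and metric names are illustrative, and the gauge is only a placeholder for real collection logic):

```python
from datadog_checks.base import AgentCheck


class MyIntegrationCheck(AgentCheck):
    """Subclasses AgentCheck and implements the required check entry point."""

    def check(self, instance):
        # Collect telemetry from the monitored system and submit it.
        # A single static gauge stands in for real collection logic.
        self.gauge('my_integration.up', 1)
```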
Implement check logic
The following example shows logic for an integration named Awesome.
This check defines a service check called awesome.search, which searches a webpage for a specific string:
Returns OK if the string is found.
Returns WARNING if the page loads but the string is missing.
Returns CRITICAL if the page cannot be reached.
To learn how to submit additional data from your check, see:
HTTP Crawler Tutorial for collecting logs from multiple log sources, such as when polling several endpoints or external HTTP APIs.
The file awesome/datadog_checks/awesome/check.py might look like this:
check.py
```python
import requests
import time

from datadog_checks.base import AgentCheck, ConfigurationError


class AwesomeCheck(AgentCheck):
    """AwesomeCheck derives from AgentCheck, and provides the required check method."""

    def check(self, instance):
        url = instance.get('url')
        search_string = instance.get('search_string')

        # It's a very good idea to do some basic sanity checking.
        # Try to be as specific as possible with the exceptions.
        if not url or not search_string:
            raise ConfigurationError('Configuration error, please fix awesome.yaml')

        try:
            response = requests.get(url)
            response.raise_for_status()
        # Something went horribly wrong
        except Exception as e:
            # Ideally we'd use a more specific message...
            self.service_check('awesome.search', self.CRITICAL, message=str(e))
            # Submit an error log
            self.send_log({
                'message': f'Failed to access {url}: {str(e)}',
                'timestamp': time.time(),
                'status': 'error',
                'service': 'awesome',
                'url': url,
            })
        # Page is accessible
        else:
            # search_string is present
            if search_string in response.text:
                self.service_check('awesome.search', self.OK)
                # Submit an info log
                self.send_log({
                    'message': f'Successfully found "{search_string}" at {url}',
                    'timestamp': time.time(),
                    'status': 'info',
                    'service': 'awesome',
                    'url': url,
                    'search_string': search_string,
                })
            # search_string was not found
            else:
                self.service_check('awesome.search', self.WARNING)
                # Submit a warning log
                self.send_log({
                    'message': f'String "{search_string}" not found at {url}',
                    'timestamp': time.time(),
                    'status': 'warning',
                    'service': 'awesome',
                    'url': url,
                    'search_string': search_string,
                })
```
pytest and hatch are used to run the tests. Tests are required to publish your integration.
Write a unit test
The first part of the check method for Awesome retrieves and verifies two elements from the configuration file. This is a good candidate for a unit test.
Open the file at awesome/tests/test_awesome.py and replace the contents with the following:
test_awesome.py
```python
import pytest

# Don't forget to import your integration
from datadog_checks.awesome import AwesomeCheck
from datadog_checks.base import ConfigurationError


@pytest.mark.unit
def test_config():
    instance = {}
    c = AwesomeCheck('awesome', {}, [instance])

    # empty instance
    with pytest.raises(ConfigurationError):
        c.check(instance)

    # only the url
    with pytest.raises(ConfigurationError):
        c.check({'url': 'http://foobar'})

    # only the search string
    with pytest.raises(ConfigurationError):
        c.check({'search_string': 'foo'})

    # this should not fail
    c.check({'url': 'http://foobar', 'search_string': 'foo'})
```
pytest has the concept of markers that can be used to group tests into categories. Notice that test_config is marked as a unit test.
The scaffolding is set up to run all the tests located in awesome/tests. To run the tests, run the following command:
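ddev test awesome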
Next, open the file at awesome/tests/conftest.py and replace the contents with the following:
conftest.py
```python
import os

import pytest

from datadog_checks.dev import docker_run, get_docker_hostname, get_here

URL = 'http://{}:8000'.format(get_docker_hostname())
SEARCH_STRING = 'Thank you for using nginx.'
INSTANCE = {'url': URL, 'search_string': SEARCH_STRING}


@pytest.fixture(scope='session')
def dd_environment():
    compose_file = os.path.join(get_here(), 'docker-compose.yml')

    # This does 3 things:
    #
    # 1. Spins up the services defined in the compose file
    # 2. Waits for the url to be available before running the tests
    # 3. Tears down the services when the tests are finished
    with docker_run(compose_file, endpoints=[URL]):
        yield INSTANCE


@pytest.fixture
def instance():
    return INSTANCE.copy()
```
Add an integration test
After you’ve set up an environment for the integration test, add an integration test to the awesome/tests/test_awesome.py file:
test_awesome.py
```python
@pytest.mark.integration
@pytest.mark.usefixtures('dd_environment')
def test_service_check(aggregator, instance):
    c = AwesomeCheck('awesome', {}, [instance])

    # the check should send OK
    c.check(instance)
    aggregator.assert_service_check('awesome.search', AwesomeCheck.OK)

    # the check should send WARNING
    instance['search_string'] = 'Apache'
    c.check(instance)
    aggregator.assert_service_check('awesome.search', AwesomeCheck.WARNING)
```
To speed up development, use the -m/--marker option to run integration tests only:
ddev test -m integration awesome
Test your agent check
Agent-based integrations are distributed as Python wheel (.whl) files that customers install through the Datadog Agent. Before publishing your integration, you can test it locally by manually building and installing the wheel package.
Build the wheel
The pyproject.toml file provides the metadata used to package and build the wheel. The wheel contains the files necessary for the integration to function, including the agent check, the example configuration file, and artifacts generated during the wheel build.
After your pyproject.toml is ready, create a wheel using one of the following options:
(Recommended) With the ddev tooling: ddev release build <INTEGRATION_NAME>.
Without the ddev tooling: cd <INTEGRATION_DIR> && pip wheel . --no-deps --wheel-dir dist.
Install the wheel
The wheel is installed using the Agent integration command, available in Agent v6.10.0 or later. Depending on your environment, you may need to execute this command as a specific user or with specific privileges:
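For example, on a Linux host, where the Agent runs as the dd-agent user, the command typically looks like this (the wheel path is illustrative):
sudo -u dd-agent datadog-agent integration install -w /path/to/<INTEGRATION_NAME>/dist/<WHEEL_NAME>.whl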
Open a pull request with your integration directory in the appropriate repo, either Datadog/integrations-extras or Datadog/marketplace. The pull request is reviewed in parallel with your Developer Platform submission.
Updating your integration
After your integration is published, you can release updates through the Developer Platform.
Bumping an integration version
A version bump is needed whenever you add, remove, or modify functionality (for example, when introducing new metrics, updating dashboards, or changing integration code). It’s not required for non-functional updates, such as changes to written content, branding, logos, or images.
In Developer Platform, include a new entry in the Release Notes tab following this format:
## Version Number / Date (YYYY-MM-DD)
***Added***:
* Description of new feature
* Description of new feature
***Fixed***:
* Description of fix
* Description of fix
***Changed***:
* Description of update or improvement
* Description of update or improvement
***Removed***:
* Description of removed feature
* Description of removed feature
Make sure to update all references to the version number across the integration’s documentation and installation instructions.
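For reference, the version string itself typically lives in the __about__.py file that the ddev scaffolding generates, for example awesome/datadog_checks/awesome/__about__.py:

```python
__version__ = '1.0.1'
```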
Further reading
Additional helpful documentation, links, and articles: