Overview
This page walks Technology Partners through how to create a Datadog Agent integration, which you can list as out-of-the-box on the Integrations page, or for a price on the Marketplace page.
An Agent-based integration uses the Datadog Agent to submit data through custom checks written by developers. These checks can emit metrics, events, and service checks into a customer’s Datadog account. While the Agent itself can submit logs, this is configured outside of the check.
When to use Agent-based integrations
Agent integrations are best suited for collecting data from systems or applications running in a:
Local Area Network (LAN)
Virtual Private Cloud (VPC)
Agent-based integrations require publishing and deploying as a Python wheel (.whl).
Development process
The process to build an Agent-based integration looks like this:
Join the Datadog Partner Network
Apply to the Datadog Partner Network. Once you are accepted, the Datadog Technology Partner team schedules an introductory call with you.
Set up your development environment
Request a Datadog sandbox account through the Datadog Partner Network portal.
Install the necessary development tools.
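The linked guides cover installation in full; as a sketch, the Agent Integration Developer Tool (ddev) is distributed on PyPI and is typically installed in an isolated environment, for example with pipx:

```shell
# Installs the ddev CLI in its own virtual environment (assumes pipx and
# a supported Python 3 are already available on your machine).
pipx install ddev
```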
Create your integration
Within your Datadog sandbox, navigate to the Developer Platform and add a new listing.
Select a tab for instructions on building an out-of-the-box Agent-based integration on the Integrations page, or an Agent-based integration on the Marketplace page.
To build an out-of-the-box integration:
Create a dd directory:
mkdir $HOME/dd && cd $HOME/dd
The Datadog Development Toolkit expects you to work in the $HOME/dd/ directory. This is not mandatory, but working in a different directory requires additional configuration steps.
The Agent Integration Developer Tool allows you to create scaffolding when you are developing an integration by generating a skeleton of your integration tile’s assets and metadata. For instructions on installing the tool, see Install the Datadog Agent Integration Developer Tool.
To configure the tool for the integrations-extras repository:
Optionally, if your integrations-extras repo is somewhere other than $HOME/dd/, adjust the ddev configuration file:
ddev config set repos.extras "/path/to/integrations-extras"
Set integrations-extras as the default working repository:

ddev config set repo extras

To build an integration on the Marketplace page, first create a dd directory:

mkdir $HOME/dd

The Datadog Development Toolkit command expects you to be working in the $HOME/dd/ directory. This is not mandatory, but working in a different directory requires additional configuration steps.
Once you have been granted access to the Marketplace repository, create the dd directory and clone the marketplace repository:
Install and configure the Datadog development toolkit
The Agent Integration Developer Tool allows you to create scaffolding when you are developing an integration by generating a skeleton of your integration tile’s assets and metadata. For instructions on installing the tool, see Install the Datadog Agent Integration Developer Tool.
Once you have installed the Agent Integration Developer Tool, configure it for the Marketplace repository.
Set marketplace as the default working repository:
ddev config set repos.marketplace $HOME/dd/marketplace
ddev config set repo marketplace
If you used a directory other than $HOME/dd to clone the marketplace directory, use the following command to set your working repository:
ddev config set repos.marketplace <PATH/TO/MARKETPLACE>
ddev config set repo marketplace
Create your integration
Once you’ve downloaded Docker, installed an appropriate version of Python, and prepared your development environment, you can start creating an Agent-based integration.
The following instructions use an example integration called Awesome. You can follow along using the Awesome code, or substitute your own integration's name and code in the commands. For example, use ddev create <your-integration-name> instead of ddev create Awesome.
Create scaffolding for your integration
The ddev create command runs an interactive tool that creates the basic file and path structure (or scaffolding) necessary for an Agent-based integration.
Before you create your first integration directory, try a dry-run using the -n/--dry-run flag, which doesn’t write anything to the disk:
ddev create -n Awesome
This command displays the path where the files would have been written, as well as the structure itself. Make sure the path in the first line of output matches your repository location.
Run the command without the -n flag. The tool asks you for an email and name and then creates the files you need to get started with an integration.
If you are creating an integration for the Datadog Marketplace, ensure that your directory follows the pattern of {partner name}_{integration name}.
ddev create Awesome
Write an Agent check
At the core of each Agent-based integration is an Agent Check that periodically collects information and sends it to Datadog.
Checks inherit their logic from the AgentCheck base class and have the following requirements:
Integrations running on the Datadog Agent v7 or later must be compatible with Python 3. Integrations running on the Datadog Agent v5 and v6 still use Python 2.7.
Checks must derive from AgentCheck.
Checks must provide a method with this signature: check(self, instance).
Checks are organized in regular Python packages under the datadog_checks namespace. For example, the code for Awesome lives in the awesome/datadog_checks/awesome/ directory.
The name of the package must be the same as the check name.
There are no restrictions on the name of the Python modules within that package, nor on the name of the class implementing the check.
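The requirements above can be sketched as a minimal check. To keep the sketch runnable anywhere, the AgentCheck class below is a tiny stand-in for the real datadog_checks.base.AgentCheck (which provides the same service_check API plus much more); MinimalCheck and the minimal.heartbeat name are illustrative:

```python
# Stand-in for datadog_checks.base.AgentCheck so this sketch runs
# without the Agent libraries installed.
class AgentCheck:
    OK, WARNING, CRITICAL = 0, 1, 2

    def __init__(self, name, init_config, instances):
        self.name = name
        self.instances = instances
        self.calls = []

    def service_check(self, name, status, message=None):
        # The real base class submits to Datadog; here we just record the call.
        self.calls.append((name, status, message))


# In a real integration, this class lives in a package under the
# datadog_checks namespace and derives from the real AgentCheck.
class MinimalCheck(AgentCheck):
    def check(self, instance):
        # The Agent invokes check(self, instance) on every collection interval.
        self.service_check('minimal.heartbeat', self.OK)


c = MinimalCheck('minimal', {}, [{}])
c.check({})
print(c.calls)  # [('minimal.heartbeat', 0, None)]
```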
Implement check logic
For Awesome, the Agent Check is composed of a service check named awesome.search that searches for a string on a web page. It results in OK if the string is present, WARNING if the page is accessible but the string was not found, and CRITICAL if the page is inaccessible.
To learn how to submit metrics with your Agent Check, see Custom Agent Check.
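As a sketch of what metric submission looks like: gauge and count are methods on the real AgentCheck base class; the stub below only records the calls so the example runs standalone, and the metric names and tags are illustrative:

```python
# Stub mirroring just the AgentCheck metric methods used below.
class AgentCheck:
    def __init__(self, name, init_config, instances):
        self.metrics = []

    def gauge(self, name, value, tags=None):
        self.metrics.append(('gauge', name, value, tuple(tags or ())))

    def count(self, name, value, tags=None):
        self.metrics.append(('count', name, value, tuple(tags or ())))


class AwesomeMetrics(AgentCheck):
    def check(self, instance):
        # A gauge reports a point-in-time value each interval...
        self.gauge('awesome.response_time', 0.25, tags=['endpoint:/search'])
        # ...while a count reports how many events occurred this interval.
        self.count('awesome.requests', 3, tags=['endpoint:/search'])


c = AwesomeMetrics('awesome', {}, [{}])
c.check({})
print([m[0] for m in c.metrics])  # ['gauge', 'count']
```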
The code contained within awesome/datadog_checks/awesome/check.py looks something like this:
check.py
import requests

from datadog_checks.base import AgentCheck, ConfigurationError


class AwesomeCheck(AgentCheck):
    """AwesomeCheck derives from AgentCheck, and provides the required check method."""

    def check(self, instance):
        url = instance.get('url')
        search_string = instance.get('search_string')

        # It's a very good idea to do some basic sanity checking.
        # Try to be as specific as possible with the exceptions.
        if not url or not search_string:
            raise ConfigurationError('Configuration error, please fix awesome.yaml')

        try:
            response = requests.get(url)
            response.raise_for_status()
        # Something went horribly wrong
        except Exception as e:
            # Ideally we'd use a more specific message...
            self.service_check('awesome.search', self.CRITICAL, message=str(e))
        # Page is accessible
        else:
            # search_string is present
            if search_string in response.text:
                self.service_check('awesome.search', self.OK)
            # search_string was not found
            else:
                self.service_check('awesome.search', self.WARNING)
pytest and hatch are used to run the tests. Tests are required in order to publish your integration.
Write a unit test
The first part of the check method for Awesome retrieves and verifies two elements from the configuration file. This is a good candidate for a unit test.
Open the file at awesome/tests/test_awesome.py and replace the contents with the following:
test_awesome.py
import pytest

# Don't forget to import your integration
from datadog_checks.awesome import AwesomeCheck
from datadog_checks.base import ConfigurationError


@pytest.mark.unit
def test_config():
    instance = {}
    c = AwesomeCheck('awesome', {}, [instance])

    # empty instance
    with pytest.raises(ConfigurationError):
        c.check(instance)

    # only the url
    with pytest.raises(ConfigurationError):
        c.check({'url': 'http://foobar'})

    # only the search string
    with pytest.raises(ConfigurationError):
        c.check({'search_string': 'foo'})

    # this should not fail
    c.check({'url': 'http://foobar', 'search_string': 'foo'})
pytest has the concept of markers that can be used to group tests into categories. Notice that test_config is marked as a unit test.
The scaffolding is set up to run all the tests located in awesome/tests. To run the tests, run the following command:

ddev test awesome
Next, open the file at awesome/tests/conftest.py and replace the contents with the following:
conftest.py
import os

import pytest

from datadog_checks.dev import docker_run, get_docker_hostname, get_here

URL = 'http://{}:8000'.format(get_docker_hostname())
SEARCH_STRING = 'Thank you for using nginx.'
INSTANCE = {'url': URL, 'search_string': SEARCH_STRING}


@pytest.fixture(scope='session')
def dd_environment():
    compose_file = os.path.join(get_here(), 'docker-compose.yml')

    # This does 3 things:
    #
    # 1. Spins up the services defined in the compose file
    # 2. Waits for the url to be available before running the tests
    # 3. Tears down the services when the tests are finished
    with docker_run(compose_file, endpoints=[URL]):
        yield INSTANCE


@pytest.fixture
def instance():
    return INSTANCE.copy()
Add an integration test
After you’ve set up an environment for the integration test, add an integration test to the awesome/tests/test_awesome.py file:
test_awesome.py
@pytest.mark.integration
@pytest.mark.usefixtures('dd_environment')
def test_service_check(aggregator, instance):
    c = AwesomeCheck('awesome', {}, [instance])

    # the check should send OK
    c.check(instance)
    aggregator.assert_service_check('awesome.search', AwesomeCheck.OK)

    # the check should send WARNING
    instance['search_string'] = 'Apache'
    c.check(instance)
    aggregator.assert_service_check('awesome.search', AwesomeCheck.WARNING)
To speed up development, use the -m/--marker option to run integration tests only:
ddev test -m integration awesome
Your integration is almost complete. Return to the Developer Platform in your sandbox to finalize your submission.
Build the wheel
The pyproject.toml file provides the metadata that is used to package and build the wheel. The wheel contains the files necessary for the functioning of the integration itself, which includes the Agent Check, configuration example file, and artifacts generated during the wheel build.
Any other elements, such as the metadata files, are not packaged in the wheel; they are used elsewhere by the Datadog platform and ecosystem.
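As a sketch of the metadata involved (field values are illustrative; ddev create generates a complete pyproject.toml for you), a minimal file using the hatchling build backend might look like:

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "datadog-awesome"
description = "The Awesome check monitors a web page for a search string"
dynamic = ["version"]

# Read the version from the integration's __about__.py at build time.
[tool.hatch.version]
path = "datadog_checks/awesome/__about__.py"
```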
Once your pyproject.toml is ready, create a wheel using one of the following options:
(Recommended) With the ddev tooling: ddev release build <INTEGRATION_NAME>.
Without the ddev tooling: cd <INTEGRATION_DIR> && pip wheel . --no-deps --wheel-dir dist.
Install the wheel
The wheel is installed using the Agent integration command, available in Agent v6.10.0 or later. Depending on your environment, you may need to execute this command as a specific user or with specific privileges.
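For example, on Linux the Agent runs as the dd-agent user, so the install typically looks like the following (the wheel filename is illustrative):

```shell
# Run as the dd-agent user on Linux; adjust the wheel path to your build output.
sudo -u dd-agent datadog-agent integration install -w ./dist/datadog_awesome-1.0.0-py3-none-any.whl
```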
Follow the steps outlined within the Developer Platform to submit your Agent check code for review in GitHub. The pull request will be released with your integration upon approval.
Update your integration
If you are editing or adding new integration code, a version bump is required.
If you are editing or adding new README content, manifest information, or assets such as dashboards and monitor templates, a version bump is not needed.
Bumping an integration version
In addition to any code changes, the following is required when bumping an integration version:
Update __about__.py to reflect the new version number. This file can be found in your integration’s directory under /datadog_checks/<your_check_name>/__about__.py.
Add an entry to the Release Notes in the Developer Platform that adheres to the following format:
## Version Number / Date in YYYY-MM-DD
***Added***:
* New feature
* New feature
***Fixed***:
* Bug fix
* Bug fix
***Changed***:
* Feature update
* Feature update
***Removed***:
* Feature removal
* Feature removal
Update all references to the version number in the installation instructions and elsewhere; installation instructions often pin a specific version, which must be kept in sync.
Further reading
Additional helpful documentation, links, and articles: