CI/CD Integrations Configuration

This page is about configuring Synthetic tests for your continuous integration (CI) pipelines. If you want to bring your CI metrics and data into Datadog dashboards, see the Continuous Integration Visibility section.


Use the @datadog/datadog-ci NPM package to run Synthetic tests directly within your CI/CD pipeline. You can automatically halt a build, block a deployment, or roll back a deployment when a Synthetic test detects a regression.

Install a package

The package is published under @datadog/datadog-ci in the NPM registry.

Install the package through NPM:

npm install --save-dev @datadog/datadog-ci

Install the package through Yarn:

yarn add --dev @datadog/datadog-ci

Set up a client

To set up your client, configure your Datadog API and application keys. These keys can be defined in three different ways:

  1. As environment variables:

    export DATADOG_API_KEY="<API_KEY>"
    export DATADOG_APP_KEY="<APPLICATION_KEY>"
  2. Passed to the CLI when running your tests:

    datadog-ci synthetics run-tests --apiKey "<API_KEY>" --appKey "<APPLICATION_KEY>"
  3. Or defined in a global configuration file:

    The global JSON configuration file can specify additional advanced options. Specify the path to this file using the --config flag when launching your tests. By default, datadog-ci looks for a global configuration file named datadog-ci.json.

In the global configuration file, you can configure the following options:

  • apiKey: The API key used to query the Datadog API.
  • appKey: The application key used to query the Datadog API.
  • datadogSite: The Datadog instance to which requests are sent. The default is datadoghq.com.
  • files: Glob pattern to detect Synthetic tests configuration files.
  • global: Overrides of Synthetic tests applied to all tests (see below for descriptions of each field).
  • proxy: The proxy to be used for outgoing connections to Datadog. The host and port keys are mandatory arguments, and the protocol key defaults to http. Supported values for the protocol key are http, https, socks, socks4, socks4a, socks5, socks5h, pac+data, pac+file, pac+ftp, pac+http, and pac+https. The library used to configure the proxy is the proxy-agent library.
  • subdomain: The name of the custom subdomain set to access your Datadog application. For example, if the URL used to access Datadog is myorg.datadoghq.com, the subdomain value needs to be set to myorg.
  • tunnel: Use the secure tunnel to execute your test batch.
  • testSearchQuery: Pass a query to select which Synthetic tests to run. If you are running tests in the CLI, use the -s flag.

For example:

Global Configuration File

    {
        "apiKey": "<DATADOG_API_KEY>",
        "datadogSite": "datadoghq.com",
        "files": "{,!(node_modules)/**/}*.synthetics.json",
        "global": {
            "allowInsecureCertificates": true,
            "basicAuth": { "username": "test", "password": "test" },
            "body": "{\"fakeContent\":true}",
            "bodyType": "application/json",
            "cookies": "name1=value1;name2=value2;",
            "deviceIds": ["laptop_large"],
            "followRedirects": true,
            "headers": { "<NEW_HEADER>": "<NEW_VALUE>" },
            "locations": ["aws:us-west-1"],
            "retry": { "count": 2, "interval": 300 },
            "executionRule": "blocking",
            "startUrl": "{{URL}}?static_hash={{STATIC_HASH}}",
            "variables": { "titleVariable": "new value" },
            "pollingTimeout": 180000
        },
        "proxy": {
            "auth": {
                "username": "login",
                "password": "pwd"
            },
            "host": "<HOST>",
            "port": 3128,
            "protocol": "http"
        },
        "subdomain": "subdomainname",
        "tunnel": true
    }

Configure tests

By default, the client automatically discovers and runs all tests specified in **/*.synthetics.json files. This path can be configured in the global configuration file.

These files have a tests key which contains an array of objects with the IDs of the tests to run and any potential test configuration overrides.

For example:

Basic Test Configuration File

    {
        "tests": [
            { "id": "<TEST_PUBLIC_ID>" },
            { "id": "<TEST_PUBLIC_ID>" }
        ]
    }

Additional configuration

The default configurations used for the tests are the original tests’ configurations, which are visible in the UI or by getting your tests’ configurations from the API.

However, in the context of your CI deployment, you may decide to override some or all of your test parameters with the overrides below. To define overrides for all of your tests, set the same parameters at the global configuration file level.

  • allowInsecureCertificates: Type: boolean. Disable certificate checks in HTTP tests.
  • basicAuth: Type: object. Credentials to provide in case basic authentication is encountered in HTTP or browser tests.
      • username: string. Username to use in basic authentication.
      • password: string. Password to use in basic authentication.
  • body: Type: string. Data to send in HTTP tests.
  • bodyType: Type: string. Type of the data sent in HTTP tests.
  • cookies: Type: string. Use the provided string as the Cookie header in HTTP or browser tests.
  • deviceIds: Type: array. List of devices on which to run the browser test.
  • followRedirects: Type: boolean. Indicates whether to follow redirects in HTTP tests.
  • headers: Type: object. Headers to replace in the HTTP or browser test. This object should contain the names of the headers to replace as keys and the new header values as values.
  • locations: Type: array. List of locations from which the test runs.
  • retry: Type: object. Retry policy for the test.
      • count: integer. Number of attempts to perform in case of test failure.
      • interval: integer. Interval between attempts (in milliseconds).
  • executionRule: Type: string. Execution rule of the test that defines the CLI behavior in case of a failing test:
      • blocking: The CLI returns an error if the test fails.
      • non_blocking: The CLI only prints a warning if the test fails.
      • skipped: The test is not executed at all.
  • startUrl: Type: string. New start URL to provide to the HTTP or browser test.
  • variables: Type: object. Variables to replace in the test. This object should contain the names of the variables to replace as keys and the new variable values as values.
  • pollingTimeout: Type: integer. The duration in milliseconds after which datadog-ci stops polling for test results. The default is 120,000 ms. At the CI level, test results completed after this duration are considered failed.

Note: The test’s overrides take precedence over global overrides.

Advanced Test Configuration File

    {
        "tests": [
            {
                "id": "<TEST_PUBLIC_ID>",
                "config": {
                    "allowInsecureCertificates": true,
                    "basicAuth": { "username": "test", "password": "test" },
                    "body": "{\"fakeContent\":true}",
                    "bodyType": "application/json",
                    "cookies": "name1=value1;name2=value2;",
                    "deviceIds": ["laptop_large"],
                    "followRedirects": true,
                    "headers": { "<NEW_HEADER>": "<NEW_VALUE>" },
                    "locations": ["aws:us-west-1"],
                    "retry": { "count": 2, "interval": 300 },
                    "executionRule": "skipped",
                    "startUrl": "{{URL}}?static_hash={{STATIC_HASH}}",
                    "variables": { "titleVariable": "new value" },
                    "pollingTimeout": 180000
                }
            }
        ]
    }

Execution rule

Use the drop-down menu next to CI Execution to define the execution rule for each test at the test level.

The execution rule applied to the test is the most restrictive one between the rule set in the UI and the rule set in the configuration file. From most to least restrictive, the options are: skipped, non_blocking, and blocking. For example, if your test is configured as skipped in the UI but blocking in the configuration file, it is skipped when your tests run.
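As a sketch of this precedence rule, restrictiveness can be ranked skipped > non_blocking > blocking and the higher rank wins. The helper functions below are illustrative only, not part of datadog-ci:

```shell
# Illustrative only: rank execution rules by restrictiveness.
rank() {
  case "$1" in
    skipped)      echo 3 ;;  # most restrictive
    non_blocking) echo 2 ;;
    blocking)     echo 1 ;;  # least restrictive
  esac
}

# The effective rule is the more restrictive of the UI rule and the config-file rule.
effective_rule() {
  if [ "$(rank "$1")" -ge "$(rank "$2")" ]; then echo "$1"; else echo "$2"; fi
}

effective_rule skipped blocking    # prints: skipped
```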

Starting URL

To configure which URL your test starts on, provide a startUrl to your test object. Build your own starting URL with any part of your test's original starting URL and the following variables:

  • URL: Test's original starting URL
  • DOMAIN: Test's domain name
  • HASH: Test's hash. Example: #target
  • HOST: Test's host
  • HOSTNAME: Test's hostname
  • ORIGIN: Test's origin
  • PARAMS: Test's query parameters. Example: ?abc=123
  • PATHNAME: Test's URL path. Example: /path/to/something
  • PORT: Test's host port. Example: 81
  • PROTOCOL: Test's protocol. Example: https:
  • SUBDOMAIN: Test's subdomain. Example: www

Whether you use Synthetic tests to control your CI/CD deployments in production or staging, you can run Synthetic tests against a generated staging URL instead of in production by setting local environment variables in your test’s starting URL.

To trigger an existing Synthetics test on a staging endpoint instead of in production, set the $SUBDOMAIN environment variable to staging-example and the $PORT environment variable to a port used for staging. Your Synthetic tests run against the generated staging URL instead of running in production.

For example, with $SUBDOMAIN set to staging-example and $PORT set to your staging port, a startUrl such as:

  • https://$SUBDOMAIN.com:$PORT{{PATHNAME}}

points the test at the staging endpoint while reusing the original URL's path.

Note: If you have environment variables with names corresponding to one of the reserved variables above, your environment variables are ignored and replaced with the corresponding component parsed from your test’s startUrl.
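The environment-variable substitution described above can be sketched in plain shell. The domain, port, and path below are illustrative assumptions, not values defined by datadog-ci:

```shell
# Illustrative sketch: local environment variables forming a staging URL.
# The domain ".com", port 3000, and path /login are made-up examples.
export SUBDOMAIN="staging-example"
export PORT="3000"

# A startUrl template like "https://$SUBDOMAIN.com:$PORT/login" would resolve to:
echo "https://$SUBDOMAIN.com:$PORT/login"   # prints: https://staging-example.com:3000/login
```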

Run tests

You can have the CLI auto-discover all your **/*.synthetics.json Synthetic tests (or all tests associated with the path specified in your global configuration file), or specify the tests you want to run using the -p, --public-id flag.

Run tests by executing the CLI:

yarn datadog-ci synthetics run-tests

Note: If you are launching your tests with a custom global configuration file, append your command with --config <PATH_TO_GLOBAL_CONFIG_FILE>.

Add the following to your package.json:

    {
        "scripts": {
            "datadog-ci-synthetics": "datadog-ci synthetics run-tests"
        }
    }

Then, run:

npm run datadog-ci-synthetics

Note: If you are launching your tests with a custom global configuration file, append the command associated with your datadog-ci-synthetics script with --config <PATH_TO_GLOBAL_CONFIG_FILE>.

Use the testing tunnel

The @datadog/datadog-ci NPM package also comes with secure tunnelling, allowing you to trigger Synthetic tests on your internal applications.

The testing tunnel creates an end-to-end encrypted HTTP proxy between your infrastructure and Datadog that allows all test requests sent through the CLI to be automatically routed through the datadog-ci client.

For more information, see Testing Tunnel.

Visualize test results

In your CI

You can see the outcome of test executions directly in your CI as your tests are being executed.

Successful Test Result

You can identify what caused a test to fail by looking at the execution logs and searching for causes of the failed assertion:

Failed Test Result

In the Datadog application

You can also see your CI test results listed in the CI Results Explorer and on test details pages:

CI Results Explorer

Further Reading