Compatibility

Supported test frameworks:

Test Framework    Version
JUnit 4           >= 4.10
JUnit 5           >= 5.3
TestNG            >= 6.4
Spock             >= 2.0
Cucumber          >= 5.4.0
Karate            >= 1.0.0
Scalatest         >= 3.0.8
Scala MUnit       >= 0.7.28

If your test framework is not supported, you can try instrumenting your tests using the Manual Testing API.

Supported build systems:

Build System    Version
Gradle          >= 2.0
Maven           >= 3.2.1

Other build systems, such as Ant or Bazel, are supported with the following limitations:

  • Automatic coverage configuration and reporting is not supported.
  • When building a multi-module project, every module is reported in a separate trace.

Setup

You can follow the interactive setup steps on the Datadog site or use the instructions below.

Configuring the Datadog Java Tracer varies depending on your CI provider.

We support auto-instrumentation for the following CI providers:

CI Provider       Auto-Instrumentation Method
GitHub Actions    Datadog Test Visibility GitHub Action
Jenkins           UI-based configuration with the Datadog Jenkins plugin
GitLab            Datadog Test Visibility GitLab Script
CircleCI          Datadog Test Visibility CircleCI Orb

If you are using auto-instrumentation for one of these providers, you can skip the rest of the setup steps below.

If you are using a cloud CI provider without access to the underlying worker nodes, such as GitHub Actions or CircleCI, configure the library to use the Agentless mode. For this, set the following environment variables:

DD_CIVISIBILITY_AGENTLESS_ENABLED=true (Required)
Enables or disables Agentless mode.
Default: false
DD_API_KEY (Required)
The Datadog API key used to upload the test results.
Default: (empty)

Additionally, configure the Datadog site to which you want to send data.

DD_SITE (Required)
The Datadog site to upload results to.
Default: datadoghq.com
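
For example, a minimal Agentless setup could look like the following sketch (the API key value is a placeholder; adjust DD_SITE to match your Datadog site):

export DD_CIVISIBILITY_AGENTLESS_ENABLED=true
export DD_API_KEY=<your-datadog-api-key>
export DD_SITE=datadoghq.com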

If you are running tests on an on-premises CI provider, such as Jenkins or self-managed GitLab CI, install the Datadog Agent on each worker node by following the Agent installation instructions. This is the recommended option as it allows you to automatically link test results to logs and underlying host metrics.

If you are using a Kubernetes executor, Datadog recommends using the Datadog Operator. The Operator includes the Datadog Admission Controller, which can automatically inject the tracer library into the build pods. Note: If you use the Datadog Operator, there is no need to download and inject the tracer library because the Admission Controller does this for you, so you can skip the corresponding step below. However, you still need to make sure that your pods set the environment variables or command-line parameters necessary to enable Test Visibility.

If you are not using Kubernetes or can’t use the Datadog Admission Controller, and the CI provider uses a container-based executor, set the DD_TRACE_AGENT_URL environment variable (which defaults to http://localhost:8126) in the build container running the tracer to an endpoint that is accessible from within that container. Note: Using localhost inside the build container references the container itself, not the underlying worker node or any container where the Agent might be running.

DD_TRACE_AGENT_URL includes the protocol and port (for example, http://localhost:8126). It takes precedence over DD_AGENT_HOST and DD_TRACE_AGENT_PORT and is the recommended way to configure the Datadog Agent’s URL for CI Visibility.
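
For example, if the Agent is reachable from the build container under a hypothetical host name datadog-agent (the actual host name depends on your container setup), you could set:

export DD_TRACE_AGENT_URL=http://datadog-agent:8126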

If you still have issues connecting to the Datadog Agent, use Agentless mode. Note: When using this method, tests are not correlated with logs and infrastructure metrics.

Downloading tracer library

You only need to download the tracer library once for each server.

If the tracer library is already available locally on the server, you can proceed directly to running the tests.

Declare DD_TRACER_FOLDER variable with the path to the folder where you want to store the downloaded tracer JAR:

export DD_TRACER_FOLDER=... # e.g. ~/.datadog

Run the command below to download the tracer JAR to the specified folder:

wget -O $DD_TRACER_FOLDER/dd-java-agent.jar 'https://dtdg.co/latest-java-tracer'

You can run the java -jar $DD_TRACER_FOLDER/dd-java-agent.jar command to check the version of the tracer library.

Running your tests

Set the following environment variables to configure the tracer:

DD_CIVISIBILITY_ENABLED=true (Required)
Enables the Test Optimization product.
DD_ENV (Required)
Environment where the tests are being run (for example: local when running tests on a developer workstation or ci when running them on a CI provider).
DD_SERVICE (Required)
Name of the service or library being tested.
DD_TRACER_FOLDER (Required)
Path to the folder where the downloaded Java Tracer is located.
MAVEN_OPTS=-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar (Required)
Injects the tracer into the Maven build process.

Run your tests as you normally do (for example: mvn test or mvn verify).

Make sure to set the DD_TRACER_FOLDER variable to the path where you have downloaded the tracer.
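
For example, a minimal sketch of a Maven run with the tracer attached (the environment name, service name, and tracer folder are placeholders):

export DD_CIVISIBILITY_ENABLED=true
export DD_ENV=ci
export DD_SERVICE=my-java-app
export DD_TRACER_FOLDER=~/.datadog
export MAVEN_OPTS=-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar
mvn clean verify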

Run your tests using the org.gradle.jvmargs system property to specify the path to the Datadog Java Tracer JAR.

When specifying tracer arguments, include the following:

  • Enable Test Optimization by setting the dd.civisibility.enabled property to true.
  • Define the environment where the tests are being run using the dd.env property (for example: local when running tests on a developer workstation or ci when running them on a CI provider).
  • Define the name of the service or library being tested in the dd.service property.

For example:

./gradlew cleanTest test -Dorg.gradle.jvmargs=\
-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar=\
dd.civisibility.enabled=true,\
dd.env=ci,\
dd.service=my-java-app

Specifying org.gradle.jvmargs in the command line overrides the value specified elsewhere. If you have this property specified in a gradle.properties file, be sure to replicate the necessary settings in the command line invocation.
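
For example, if your gradle.properties already sets org.gradle.jvmargs=-Xmx2g (a placeholder for whatever JVM arguments you currently use), a sketch of a command line that keeps those settings while attaching the tracer could look like:

./gradlew cleanTest test -Dorg.gradle.jvmargs="-Xmx2g -javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar=dd.civisibility.enabled=true,dd.env=ci,dd.service=my-java-app"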

Set the following environment variables to configure the tracer:

DD_CIVISIBILITY_ENABLED=true (Required)
Enables Test Optimization.
DD_ENV (Required)
Environment where the tests are being run (for example: local when running tests on a developer workstation or ci when running them on a CI provider).
DD_SERVICE (Required)
Name of the service or library being tested.
DD_TRACER_FOLDER (Required)
Path to the folder where the downloaded Java Tracer is located.
JAVA_TOOL_OPTIONS=-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar (Required)
Injects the tracer into the JVMs that execute your tests.

Run your tests as you normally do.
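
For example, a minimal sketch for a build system without dedicated support, such as Ant (the ant test target is a placeholder; any command that starts test JVMs picks up JAVA_TOOL_OPTIONS the same way):

export DD_CIVISIBILITY_ENABLED=true
export DD_ENV=ci
export DD_SERVICE=my-java-app
export DD_TRACER_FOLDER=~/.datadog
export JAVA_TOOL_OPTIONS=-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar
ant test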

Configuration

Default configuration values work well in most cases.

However, if you need to fine-tune the tracer’s behavior, you can use the Datadog Tracer configuration options.

Collecting Git metadata

Datadog uses Git information for visualizing your test results and grouping them by repository, branch, and commit. Git metadata is automatically collected by the test instrumentation from CI provider environment variables and the local .git folder in the project path, if available.

If you are running tests on an unsupported CI provider or without a .git folder, you can set the Git information manually using environment variables. These environment variables take precedence over any auto-detected information. Set the following environment variables to provide Git information:

DD_GIT_REPOSITORY_URL
URL of the repository where the code is stored. Both HTTP and SSH URLs are supported.
Example: git@github.com:MyCompany/MyApp.git, https://github.com/MyCompany/MyApp.git
DD_GIT_BRANCH
Git branch being tested. Leave empty if providing tag information instead.
Example: develop
DD_GIT_TAG
Git tag being tested (if applicable). Leave empty if providing branch information instead.
Example: 1.0.1
DD_GIT_COMMIT_SHA
Full commit hash.
Example: a18ebf361cc831f5535e58ec4fae04ffd98d8152
DD_GIT_COMMIT_MESSAGE
Commit message.
Example: Set release number
DD_GIT_COMMIT_AUTHOR_NAME
Commit author name.
Example: John Smith
DD_GIT_COMMIT_AUTHOR_EMAIL
Commit author email.
Example: john@example.com
DD_GIT_COMMIT_AUTHOR_DATE
Commit author date in ISO 8601 format.
Example: 2021-03-12T16:00:28Z
DD_GIT_COMMIT_COMMITTER_NAME
Commit committer name.
Example: Jane Smith
DD_GIT_COMMIT_COMMITTER_EMAIL
Commit committer email.
Example: jane@example.com
DD_GIT_COMMIT_COMMITTER_DATE
Commit committer date in ISO 8601 format.
Example: 2021-03-12T16:00:28Z
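
For example, a minimal sketch that provides repository, branch, and commit information manually (all values are placeholders taken from the examples above):

export DD_GIT_REPOSITORY_URL=https://github.com/MyCompany/MyApp.git
export DD_GIT_BRANCH=develop
export DD_GIT_COMMIT_SHA=a18ebf361cc831f5535e58ec4fae04ffd98d8152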

Extensions

The tracer exposes a set of APIs that can be used to extend its functionality programmatically.

Adding custom tags to tests

To add custom tags, include the opentracing-util library as a compile-time dependency in your project.

You can then add custom tags to your tests by using the active span:

import io.opentracing.Span;
import io.opentracing.util.GlobalTracer;

// ...
// inside your test
final Span span = GlobalTracer.get().activeSpan();
if (span != null) {
  span.setTag("test_owner", "my_team");
}
// test continues normally
// ...

To create filters or group by fields for these tags, you must first create facets.

For more information about adding tags, see the Adding Tags section of the Java custom instrumentation documentation.

Adding custom measures to tests

Just like tags, you can add custom measures to your tests by using the current active span:

import io.opentracing.Span;
import io.opentracing.util.GlobalTracer;

// ...
// inside your test
final Span span = GlobalTracer.get().activeSpan();
if (span != null) {
  span.setTag("test.memory.usage", 1e8);
}
// test continues normally
// ...

For more information about custom measures, see the Add Custom Measures guide.

Using manual testing API

If you use one of the supported testing frameworks, the Java Tracer automatically instruments your tests and sends the results to the Datadog backend.

If you are using a framework that is not supported, or an ad-hoc testing solution, you can harness the manual testing API, which also reports test results to the backend.

To use the manual testing API, add the dd-trace-api library as a compile-time dependency to your project.

Domain model

The API is based around four concepts: test session, test module, test suite, and test.

Test session

A test session represents a project build, which typically corresponds to execution of a test command issued by a user or by a CI script.

To start a test session, call datadog.trace.api.civisibility.CIVisibility#startSession and pass the name of the project and the name of the testing framework you used.

When all your tests have finished, call datadog.trace.api.civisibility.DDTestSession#end, which forces the library to send all remaining test results to the backend.

Test module

A test module represents a smaller unit of work within a project build, typically corresponding to a project module, such as a Maven submodule or a Gradle subproject.

To start a test module, call datadog.trace.api.civisibility.DDTestSession#testModuleStart and pass the name of the module.

When the module has finished building and testing, call datadog.trace.api.civisibility.DDTestModule#end.

Test Suite

A test suite comprises a set of tests that share common functionality. They can share a common initialization and teardown, and can also share some variables. A single suite usually corresponds to a Java class that contains test cases.

Create test suites in a test module by calling datadog.trace.api.civisibility.DDTestModule#testSuiteStart and passing the name of the test suite.

Call datadog.trace.api.civisibility.DDTestSuite#end when all the related tests in the suite have finished their execution.

Test

A test represents a single test case that is executed as part of a test suite. Usually it corresponds to a method that contains testing logic.

Create tests in a suite by calling datadog.trace.api.civisibility.DDTestSuite#testStart and passing the name of the test.

Call datadog.trace.api.civisibility.DDTest#end when a test has finished execution.

Code Example

The following code represents a simple usage of the API:

package com.datadog.civisibility.example;

import datadog.trace.api.civisibility.CIVisibility;
import datadog.trace.api.civisibility.DDTest;
import datadog.trace.api.civisibility.DDTestModule;
import datadog.trace.api.civisibility.DDTestSession;
import datadog.trace.api.civisibility.DDTestSuite;
import java.lang.reflect.Method;

// the null arguments in the calls below are optional startTime/endTime values:
// when they are not specified, current time is used
public class ManualTest {
    public static void main(String[] args) throws Exception {
        DDTestSession testSession = CIVisibility.startSession("my-project-name", "my-test-framework", null);
        testSession.setTag("my-tag", "additional-session-metadata");
        try {
            runTestModule(testSession);
        } finally {
            testSession.end(null);
        }
    }

    private static void runTestModule(DDTestSession testSession) throws Exception {
        DDTestModule testModule = testSession.testModuleStart("my-module", null);
        testModule.setTag("my-module-tag", "additional-module-metadata");
        try {
            runFirstTestSuite(testModule);
            runSecondTestSuite(testModule);
        } finally {
            testModule.end(null);
        }
    }

    private static void runFirstTestSuite(DDTestModule testModule) throws Exception {
        DDTestSuite testSuite = testModule.testSuiteStart("my-suite", ManualTest.class, null);
        testSuite.setTag("my-suite-tag", "additional-suite-metadata");
        try {
            runTestCase(testSuite);
        } finally {
            testSuite.end(null);
        }
    }

    private static void runTestCase(DDTestSuite testSuite) throws Exception {
        Method myTestCaseMethod = ManualTest.class.getDeclaredMethod("myTestCase");
        DDTest ddTest = testSuite.testStart("myTestCase", myTestCaseMethod, null);
        ddTest.setTag("my-test-case-tag", "additional-test-case-metadata");
        ddTest.setTag("my-test-case-tag", "more-test-case-metadata");
        try {
            myTestCase();
        } catch (Exception e) {
            ddTest.setErrorInfo(e); // pass error info to mark test case as failed
        } finally {
            ddTest.end(null);
        }
    }

    private static void myTestCase() throws Exception {
        // run some test logic
    }

    private static void runSecondTestSuite(DDTestModule testModule) {
        DDTestSuite secondTestSuite = testModule.testSuiteStart("my-second-suite", ManualTest.class, null);
        secondTestSuite.setSkipReason("this test suite is skipped"); // pass skip reason to mark test suite as skipped
        secondTestSuite.end(null);
    }
}

Always call datadog.trace.api.civisibility.DDTestSession#end at the end so that all the test info is flushed to Datadog.

Best practices

Deterministic test parameters representation

Test Optimization works best when the test parameters are deterministic and stay the same between test runs. If a test case has a parameter that varies between test executions (such as a current date, a random number, or an instance of a class whose toString() method is not overridden), some of the product features may not work as expected. For example, the history of executions may not be available, or the test case may not be classified as flaky even if it exhibits flakiness.

The best way to fix this is to make sure that the test parameters are the same between test runs.

In JUnit 5, this can also be addressed by customizing the string representation of the test parameters without changing their values. To do so, use the org.junit.jupiter.api.Named interface or change the name parameter of the org.junit.jupiter.params.ParameterizedTest annotation:

@ParameterizedTest
@MethodSource("namedArguments")
void parameterizedTest(String s, Date d) {
   // The second parameter in this test case is non-deterministic.
   // In the argument provider method it is wrapped with Named to ensure it has a deterministic name.
}

static Stream<Arguments> namedArguments() {
    return Stream.of(
            Arguments.of(
                    "a string",
                    Named.of("current date", new Date())),
            Arguments.of(
                    "another string",
                    Named.of("a date in the future", new Date(System.currentTimeMillis() + TimeUnit.DAYS.toMillis(1))))
    );
}

@ParameterizedTest(name = "[{index}] {0}, a random number from one to ten")
@MethodSource("randomArguments")
void anotherParameterizedTest(String s, int i) {
  // The second parameter in this test case is non-deterministic.
  // The name of the parameterized test is customized to ensure it has a deterministic name.
}

static Stream<Arguments> randomArguments() {
    return Stream.of(
            Arguments.of("a string", ThreadLocalRandom.current().nextInt(10) + 1),
            Arguments.of("another string", ThreadLocalRandom.current().nextInt(10) + 1)
    );
}

Troubleshooting

The tests are not appearing in Datadog after enabling Test Optimization in the tracer

Verify that the tracer is injected into your build process by examining your build’s logs. If the injection is successful, you can see a line containing DATADOG TRACER CONFIGURATION. If the line is not there, make sure that the environment variables used to inject and configure the tracer are available to the build process. A common mistake is to set the variables in a build step and run the tests in another build step. This approach may not work if the variables are not propagated between build steps.

Ensure that you are using the latest version of the tracer.

Verify that your build system and testing framework are supported by Test Optimization. See the list of supported build systems and test frameworks.

Ensure that the dd.civisibility.enabled property (or DD_CIVISIBILITY_ENABLED environment variable) is set to true in the tracer arguments.

Try running your build with tracer debug logging enabled by setting the DD_TRACE_DEBUG environment variable to true. Check the build output for any errors that indicate tracer misconfiguration, such as an unset DD_API_KEY environment variable.
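
For example, a minimal sketch with Maven (assuming the tracer is already attached as described in the setup steps):

export DD_TRACE_DEBUG=true
mvn test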

Tests or source code compilation fails when building a project with the tracer attached

By default, Test Optimization runs Java code compilation with a compiler plugin attached.

The plugin is optional, as it only serves to reduce the performance overhead.

Depending on the build configuration, adding the plugin can sometimes disrupt the compilation process.

If the plugin interferes with the build, disable it by adding dd.civisibility.compiler.plugin.auto.configuration.enabled=false to the list of -javaagent arguments (or by setting the DD_CIVISIBILITY_COMPILER_PLUGIN_AUTO_CONFIGURATION_ENABLED environment variable to false).
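
For example, with the Maven setup from above (the environment variables that enable Test Optimization stay as before), a sketch that passes the property as a -javaagent argument:

export MAVEN_OPTS=-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar=dd.civisibility.compiler.plugin.auto.configuration.enabled=false
mvn test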

Build fails because the dd-javac-plugin-client artifact cannot be found

The Java compiler plugin injected into the build may not be available if the build uses a custom artifact repository or if it runs in offline mode.

If this is the case, you can disable plugin injection by adding dd.civisibility.compiler.plugin.auto.configuration.enabled=false to the list of -javaagent arguments (or by setting the DD_CIVISIBILITY_COMPILER_PLUGIN_AUTO_CONFIGURATION_ENABLED environment variable to false).

The plugin is optional, as it only serves to reduce the performance overhead.

Tests fail when building a project with the tracer attached

In some cases, attaching the tracer can break tests, especially if they run assertions against the internal state of the JVM or against instances of third-party library classes.

While the best approach in such cases is to update the tests, a quicker option is to disable the tracer’s third-party library integrations.

The integrations provide additional insights into what happens in the tested code and are especially useful in integration tests, to monitor things like HTTP requests or database calls. They are enabled by default.

To disable a specific integration, refer to the Datadog Tracer Compatibility table for the relevant configuration property names. For example, to disable OkHttp3 client request integration, add dd.integration.okhttp-3.enabled=false to the list of -javaagent arguments.

To disable all integrations, augment the list of -javaagent arguments with dd.trace.enabled=false (or set the DD_TRACE_ENABLED environment variable to false).
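
For example, a sketch of a Gradle run that keeps Test Optimization enabled but disables the OkHttp3 client integration (the environment and service names are placeholders):

./gradlew cleanTest test -Dorg.gradle.jvmargs="-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar=dd.civisibility.enabled=true,dd.env=ci,dd.service=my-java-app,dd.integration.okhttp-3.enabled=false"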

Further reading