---
title: Java Tests
description: Datadog, the leading service for cloud-scale monitoring.
breadcrumbs: Docs > Test Optimization in Datadog > Configure Test Optimization > Java Tests
---

# Java Tests

{% callout %}
# Important note for users on the following Datadog site: app.ddog-gov.com

{% alert level="danger" %}
This product is not supported for your selected [Datadog site](https://docs.datadoghq.com/getting_started/site).
{% /alert %}

{% /callout %}

## Compatibility{% #compatibility %}

Supported test frameworks:

| Test Framework | Version                                         |
| -------------- | ----------------------------------------------- |
| JUnit 4        | \>= 4.10                                        |
| JUnit 5        | \>= 5.3                                         |
| TestNG         | \>= 6.4                                         |
| Spock          | \>= 2.0                                         |
| Cucumber       | \>= 5.4.0                                       |
| Karate         | \>= 1.0.0                                       |
| Scalatest      | \>= 3.0.8                                       |
| Scala MUnit    | \>= 0.7.28                                      |
| Scala Weaver   | \>= 0.8.4 (Only when using SBT as build system) |

If your test framework is not supported, you can try instrumenting your tests with the Manual Testing API described in the [Extensions](#using-manual-testing-api) section below.

Supported build systems:

| Build System | Version   |
| ------------ | --------- |
| Gradle       | \>= 2.0   |
| Maven        | \>= 3.2.1 |

Other build systems, such as Ant, Bazel, or SBT, are supported with the following limitations:

- Automatic coverage configuration and reporting are not supported.
- When building a multi-module project, every module is reported in a separate trace.

## Setup{% #setup %}

Follow the interactive setup steps on the [Datadog site](https://app.datadoghq.com/ci/setup/test?language=java), or use the instructions below.

Configuring the Datadog Java Tracer varies depending on your CI provider.

{% tab title="CI Provider with Auto-Instrumentation Support" %}
We support auto-instrumentation for the following CI providers:

| CI Provider    | Auto-Instrumentation method                                                                                                                         |
| -------------- | --------------------------------------------------------------------------------------------------------------------------------------------------- |
| GitHub Actions | [Datadog Test Visibility Github Action](https://github.com/marketplace/actions/configure-datadog-test-visibility)                                   |
| Jenkins        | [UI-based configuration](https://docs.datadoghq.com/continuous_integration/pipelines/jenkins/#enable-test-optimization) with Datadog Jenkins plugin |
| GitLab         | [Datadog Test Visibility GitLab Script](https://github.com/DataDog/test-visibility-gitlab-script)                                                   |
| CircleCI       | [Datadog Test Visibility CircleCI Orb](https://circleci.com/orbs/registry/orb/datadog/test-visibility-circleci-orb)                                 |

If you are using auto-instrumentation for one of these providers, you can skip the rest of the setup steps below.
{% /tab %}

{% tab title="Other Cloud CI Provider" %}
If you are using a cloud CI provider without access to the underlying worker nodes, such as GitHub Actions or CircleCI, configure the library to use the Agentless mode. For this, set the following environment variables:

{% dl %}

{% dt %}
`DD_CIVISIBILITY_AGENTLESS_ENABLED=true` (Required)
{% /dt %}

{% dd %}
Enables or disables Agentless mode. **Default**: `false`
{% /dd %}

{% dt %}
`DD_API_KEY` (Required)
{% /dt %}

{% dd %}
The [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys) used to upload the test results. **Default**: `(empty)`
{% /dd %}

{% /dl %}

Additionally, configure the [Datadog site](https://docs.datadoghq.com/getting_started/site/) to which you want to send data.

{% dl %}

{% dt %}
`DD_SITE` (Required)
{% /dt %}

{% dd %}
The [Datadog site](https://docs.datadoghq.com/getting_started/site/) to upload results to. **Default**: `datadoghq.com`
{% /dd %}

{% /dl %}
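For example, a CI step might export these variables before running the build. This is a sketch: the API key value is a placeholder, and in practice it should come from your CI provider's secret storage.

```shell
# Enable Agentless mode and point the tracer at your Datadog site.
# "your-api-key" is a placeholder; store the real key in your CI secrets.
export DD_CIVISIBILITY_AGENTLESS_ENABLED=true
export DD_API_KEY="your-api-key"
export DD_SITE="datadoghq.com"
```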

{% /tab %}

{% tab title="On-Premises CI Provider" %}
If you are running tests on an on-premises CI provider, such as Jenkins or self-managed GitLab CI, install the Datadog Agent on each worker node by following the [Agent installation instructions](https://docs.datadoghq.com/agent/). This is the recommended option as it allows you to automatically link test results to [logs](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/) and [underlying host metrics](https://docs.datadoghq.com/infrastructure/).

If you are using a Kubernetes executor, Datadog recommends using the [Datadog Operator](https://docs.datadoghq.com/containers/datadog_operator/). The operator includes [Datadog Admission Controller](https://docs.datadoghq.com/agent/cluster_agent/admission_controller/) which can automatically [inject the tracer library](https://docs.datadoghq.com/tracing/trace_collection/library_injection_local/?tab=kubernetes) into the build pods. **Note:** If you use the Datadog Operator, there is no need to download and inject the tracer library since the Admission Controller can do this for you, so you can skip the corresponding step below. However, you still need to make sure that your pods set the environment variables or command-line parameters necessary to enable Test Visibility.

If you are not using Kubernetes or can't use the Datadog Admission Controller, and the CI provider is using a container-based executor, set the `DD_TRACE_AGENT_URL` environment variable (which defaults to `http://localhost:8126`) in the build container running the tracer to an endpoint that is accessible from within that container. **Note:** Using `localhost` inside the build references the container itself, not the underlying worker node or any container the Agent might be running in.

`DD_TRACE_AGENT_URL` includes the protocol and port (for example, `http://localhost:8126`). It takes precedence over `DD_AGENT_HOST` and `DD_TRACE_AGENT_PORT`, and is the recommended parameter for configuring the Datadog Agent's URL for CI Visibility.
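As a sketch, assuming the Agent is reachable from the build container under the hostname `datadog-agent` (a placeholder; substitute an address valid in your network):

```shell
# Point the tracer at an Agent reachable from inside the build container.
# "datadog-agent" is a hypothetical hostname; replace it with your own.
export DD_TRACE_AGENT_URL="http://datadog-agent:8126"
```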

If you still have issues connecting to the Datadog Agent, use Agentless mode. **Note:** When using this method, tests are not correlated with [logs](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/) and [infrastructure metrics](https://docs.datadoghq.com/infrastructure/).
{% /tab %}

### Downloading tracer library{% #downloading-tracer-library %}

You only need to download the tracer library once for each server.

If the tracer library is already available locally on the server, you can proceed directly to running the tests.

Declare `DD_TRACER_FOLDER` variable with the path to the folder where you want to store the downloaded tracer JAR:

```shell
export DD_TRACER_FOLDER=... # e.g. ~/.datadog
```

Run the command below to download the tracer JAR to the specified folder:

```shell
wget -O $DD_TRACER_FOLDER/dd-java-agent.jar 'https://dtdg.co/latest-java-tracer'
```

You can run the `java -jar $DD_TRACER_FOLDER/dd-java-agent.jar` command to check the version of the tracer library.

### Running your tests{% #running-your-tests %}

{% tab title="Maven" %}
Set the following environment variables to configure the tracer:

{% dl %}

{% dt %}
`DD_CIVISIBILITY_ENABLED=true` (Required)
{% /dt %}

{% dd %}
Enables the Test Optimization product.
{% /dd %}

{% dt %}
`DD_ENV`
{% /dt %}

{% dd %}
Environment where the tests are being run (for example: `local` when running tests on a developer workstation or `ci` when running them on a CI provider).
{% /dd %}

{% dt %}
`DD_SERVICE`
{% /dt %}

{% dd %}
Name of the service or library being tested.
{% /dd %}

{% dt %}
`DD_TRACER_FOLDER` (Required)
{% /dt %}

{% dd %}
Path to the folder where the downloaded Java Tracer is located.
{% /dd %}

{% dt %}
`MAVEN_OPTS=-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar` (Required)
{% /dt %}

{% dd %}
Injects the tracer into the Maven build process.
{% /dd %}

{% dt %}
`DD_TEST_SESSION_NAME`
{% /dt %}

{% dd %}
Identifies a group of tests (for example: `unit-tests` or `integration-tests`).
{% /dd %}

{% /dl %}

Run your tests as you normally do (for example: `mvn test` or `mvn verify`).
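Putting it together, a CI step might look like the following sketch. The service and session names are placeholders; adjust them for your project.

```shell
export DD_CIVISIBILITY_ENABLED=true
export DD_ENV=ci
export DD_SERVICE=my-java-app          # placeholder service name
export DD_TEST_SESSION_NAME=unit-tests # placeholder session name
export DD_TRACER_FOLDER=~/.datadog     # folder containing dd-java-agent.jar
export MAVEN_OPTS="-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar"

mvn test
```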
{% /tab %}

{% tab title="Gradle" %}
Set the following environment variables to configure the tracer:

{% dl %}

{% dt %}
`DD_CIVISIBILITY_ENABLED=true` (Required)
{% /dt %}

{% dd %}
Enables the Test Optimization product.
{% /dd %}

{% dt %}
`DD_ENV`
{% /dt %}

{% dd %}
Environment where the tests are being run (for example: `local` when running tests on a developer workstation or `ci` when running them on a CI provider).
{% /dd %}

{% dt %}
`DD_SERVICE`
{% /dt %}

{% dd %}
Name of the service or library being tested.
{% /dd %}

{% dt %}
`DD_TRACER_FOLDER` (Required)
{% /dt %}

{% dd %}
Path to the folder where the downloaded Java Tracer is located.
{% /dd %}

{% dt %}
`GRADLE_OPTS=-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar` (Required)
{% /dt %}

{% dd %}
Injects the tracer into the Gradle launcher process.
{% /dd %}

{% /dl %}

Run your tests as you normally do (for example: `./gradlew clean test`).
{% /tab %}

{% tab title="SBT" %}
Set the following environment variables to configure the tracer:

{% dl %}

{% dt %}
`DD_CIVISIBILITY_ENABLED=true` (Required)
{% /dt %}

{% dd %}
Enables the Test Optimization product.
{% /dd %}

{% dt %}
`DD_TEST_SESSION_NAME`
{% /dt %}

{% dd %}
Identifies a group of tests (for example: `unit-tests` or `integration-tests`).
{% /dd %}

{% dt %}
`DD_ENV`
{% /dt %}

{% dd %}
Environment where the tests are being run (for example: `local` when running tests on a developer workstation or `ci` when running them on a CI provider).
{% /dd %}

{% dt %}
`DD_SERVICE`
{% /dt %}

{% dd %}
Name of the service or library being tested.
{% /dd %}

{% dt %}
`DD_TRACER_FOLDER` (Required)
{% /dt %}

{% dd %}
Path to the folder where the downloaded Java Tracer is located.
{% /dd %}

{% dt %}
`SBT_OPTS=-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar` (Required)
{% /dt %}

{% dd %}
Injects the tracer into the JVMs that execute your tests.
{% /dd %}

{% /dl %}

Run your tests as you normally do (for example: `sbt test`).
{% /tab %}

{% tab title="Other" %}
Set the following environment variables to configure the tracer:

{% dl %}

{% dt %}
`DD_CIVISIBILITY_ENABLED=true` (Required)
{% /dt %}

{% dd %}
Enables the Test Optimization product.
{% /dd %}

{% dt %}
`DD_TEST_SESSION_NAME`
{% /dt %}

{% dd %}
Identifies a group of tests (for example: `unit-tests` or `integration-tests`).
{% /dd %}

{% dt %}
`DD_ENV`
{% /dt %}

{% dd %}
Environment where the tests are being run (for example: `local` when running tests on a developer workstation or `ci` when running them on a CI provider).
{% /dd %}

{% dt %}
`DD_SERVICE`
{% /dt %}

{% dd %}
Name of the service or library being tested.
{% /dd %}

{% dt %}
`DD_TRACER_FOLDER` (Required)
{% /dt %}

{% dd %}
Path to the folder where the downloaded Java Tracer is located.
{% /dd %}

{% dt %}
`JAVA_TOOL_OPTIONS=-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar` (Required)
{% /dt %}

{% dd %}
Injects the tracer into the JVMs that execute your tests.
{% /dd %}

{% /dl %}

Run your tests as you normally do.
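For example, with an Ant build a CI step might look like the sketch below. The service name is a placeholder, and `ant test` stands in for whatever command runs your tests.

```shell
export DD_CIVISIBILITY_ENABLED=true
export DD_ENV=ci
export DD_SERVICE=my-java-app       # placeholder service name
export DD_TRACER_FOLDER=~/.datadog  # folder containing dd-java-agent.jar
# JAVA_TOOL_OPTIONS is picked up by any JVM started from this shell.
export JAVA_TOOL_OPTIONS="-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar"

ant test  # replace with the command that runs your tests
```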
{% /tab %}

## Configuration{% #configuration %}

Default configuration values work well in most cases.

However, if there is a need to fine-tune the tracer's behavior, [Datadog Tracer configuration](https://docs.datadoghq.com/tracing/trace_collection/library_config/java/?tab=containers#configuration) options can be used.

### Collecting Git metadata{% #collecting-git-metadata %}

Datadog uses Git information for visualizing your test results and grouping them by repository, branch, and commit. Git metadata is automatically collected by the test instrumentation from CI provider environment variables and the local `.git` folder in the project path, if available.

If you are running tests in non-supported CI providers or with no `.git` folder, you can set the Git information manually using environment variables. These environment variables take precedence over any auto-detected information. Set the following environment variables to provide Git information:

{% dl %}

{% dt %}
`DD_GIT_REPOSITORY_URL`
{% /dt %}

{% dd %}
URL of the repository where the code is stored. Both HTTP and SSH URLs are supported. **Example**: `git@github.com:MyCompany/MyApp.git`, `https://github.com/MyCompany/MyApp.git`
{% /dd %}

{% dt %}
`DD_GIT_BRANCH`
{% /dt %}

{% dd %}
Git branch being tested. Leave empty if providing tag information instead. **Example**: `develop`
{% /dd %}

{% dt %}
`DD_GIT_TAG`
{% /dt %}

{% dd %}
Git tag being tested (if applicable). Leave empty if providing branch information instead. **Example**: `1.0.1`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_SHA`
{% /dt %}

{% dd %}
Full commit hash. **Example**: `a18ebf361cc831f5535e58ec4fae04ffd98d8152`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_MESSAGE`
{% /dt %}

{% dd %}
Commit message. **Example**: `Set release number`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_AUTHOR_NAME`
{% /dt %}

{% dd %}
Commit author name. **Example**: `John Smith`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_AUTHOR_EMAIL`
{% /dt %}

{% dd %}
Commit author email. **Example**: `john@example.com`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_AUTHOR_DATE`
{% /dt %}

{% dd %}
Commit author date in ISO 8601 format. **Example**: `2021-03-12T16:00:28Z`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_COMMITTER_NAME`
{% /dt %}

{% dd %}
Commit committer name. **Example**: `Jane Smith`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_COMMITTER_EMAIL`
{% /dt %}

{% dd %}
Commit committer email. **Example**: `jane@example.com`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_COMMITTER_DATE`
{% /dt %}

{% dd %}
Commit committer date in ISO 8601 format. **Example**: `2021-03-12T16:00:28Z`
{% /dd %}

{% /dl %}
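If the `git` CLI is available in your CI environment, most of these values can be derived from the checkout itself. A sketch, assuming the working directory is the repository root:

```shell
# Derive Git metadata from the local checkout.
# The git pretty-format placeholders used: %s subject, %an/%ae author
# name/email, %aI author date in strict ISO 8601.
export DD_GIT_REPOSITORY_URL="$(git config --get remote.origin.url)"
export DD_GIT_BRANCH="$(git rev-parse --abbrev-ref HEAD)"
export DD_GIT_COMMIT_SHA="$(git rev-parse HEAD)"
export DD_GIT_COMMIT_MESSAGE="$(git log -1 --format=%s)"
export DD_GIT_COMMIT_AUTHOR_NAME="$(git log -1 --format=%an)"
export DD_GIT_COMMIT_AUTHOR_EMAIL="$(git log -1 --format=%ae)"
export DD_GIT_COMMIT_AUTHOR_DATE="$(git log -1 --format=%aI)"
```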

## Extensions{% #extensions %}

The tracer exposes a set of APIs that can be used to extend its functionality programmatically.

### Adding custom tags to tests{% #adding-custom-tags-to-tests %}

{% tab title="OpenTelemetry API" %}
To add custom tags, include the [opentelemetry-api](https://mvnrepository.com/artifact/io.opentelemetry/opentelemetry-api) library as a compile-time dependency and set `dd.trace.otel.enabled` (system property) or `DD_TRACE_OTEL_ENABLED` (environment variable) to `true`.

You can then add custom tags to your tests by using the active span:

```java
import io.opentelemetry.api.trace.Span;

// ...
// inside your test
Span span = Span.current();
span.setAttribute("test_owner", "my_team");
// test continues normally
// ...
```

For more information about adding tags, see the [Adding Tags](https://docs.datadoghq.com/tracing/trace_collection/custom_instrumentation/java?tab=locally#adding-tags) section of the Java custom instrumentation documentation.
{% /tab %}

{% tab title="OpenTracing API" %}
To add custom tags, include the [opentracing-util](https://mvnrepository.com/artifact/io.opentracing/opentracing-util) library as a compile-time dependency to your project.

You can then add custom tags to your tests by using the active span:

```java
import io.opentracing.Span;
import io.opentracing.util.GlobalTracer;

// ...
// inside your test
final Span span = GlobalTracer.get().activeSpan();
if (span != null) {
  span.setTag("test_owner", "my_team");
}
// test continues normally
// ...
```

To create filters or `group by` fields for these tags, you must first create facets.

For more information about adding tags, see the [Adding Tags](https://docs.datadoghq.com/tracing/trace_collection/custom_instrumentation/java?tab=locally#adding-tags) section of the Java custom instrumentation documentation.
{% /tab %}

### Adding custom measures to tests{% #adding-custom-measures-to-tests %}

Just like tags, you can add custom measures to your tests by using the current active span:

{% tab title="OpenTelemetry API" %}

```java
import io.opentelemetry.api.trace.Span;

// ...
// inside your test
Span span = Span.current();
span.setAttribute("test.memory.usage", 1e8);
// test continues normally
// ...
```

{% /tab %}

{% tab title="OpenTracing API" %}

```java
import io.opentracing.Span;
import io.opentracing.util.GlobalTracer;

// ...
// inside your test
final Span span = GlobalTracer.get().activeSpan();
if (span != null) {
  span.setTag("test.memory.usage", 1e8);
}
// test continues normally
// ...
```

{% /tab %}

For more information about custom measures, see the [Add Custom Measures guide](https://docs.datadoghq.com/tests/guides/add_custom_measures/?tab=java).

### Using manual testing API{% #using-manual-testing-api %}

If you use one of the supported testing frameworks, the Java Tracer automatically instruments your tests and sends the results to the Datadog backend.

If you are using a framework that is not supported, or an ad-hoc testing solution, you can harness the manual testing API, which also reports test results to the backend.

To use the manual testing API, add the [`dd-trace-api`](https://mvnrepository.com/artifact/com.datadoghq/dd-trace-api) library as a compile-time dependency to your project.

#### Domain model{% #domain-model %}

The API is based around four concepts: test session, test module, test suite, and test.

##### Test session{% #test-session %}

A test session represents a project build, which typically corresponds to execution of a test command issued by a user or by a CI script.

To start a test session, call `datadog.trace.api.civisibility.CIVisibility#startSession` and pass the name of the project and the name of the testing framework you used.

When all your tests have finished, call `datadog.trace.api.civisibility.DDTestSession#end`, which forces the library to send all remaining test results to the backend.

##### Test module{% #test-module %}

A test module represents a smaller unit of work within a project build, typically corresponding to a project module. For example, a Maven submodule or Gradle subproject.

To start a test module, call `datadog.trace.api.civisibility.DDTestSession#testModuleStart` and pass the name of the module.

When the module has finished building and testing, call `datadog.trace.api.civisibility.DDTestModule#end`.

##### Test suite{% #test-suite %}

A test suite comprises a set of tests that share common functionality. They can share a common initialization and teardown, and can also share some variables. A single suite usually corresponds to a Java class that contains test cases.

Create test suites in a test module by calling `datadog.trace.api.civisibility.DDTestModule#testSuiteStart` and passing the name of the test suite.

Call `datadog.trace.api.civisibility.DDTestSuite#end` when all the related tests in the suite have finished their execution.

##### Test{% #test %}

A test represents a single test case that is executed as part of a test suite. Usually it corresponds to a method that contains testing logic.

Create tests in a suite by calling `datadog.trace.api.civisibility.DDTestSuite#testStart` and passing the name of the test.

Call `datadog.trace.api.civisibility.DDTest#end` when a test has finished execution.

#### Code Example{% #code-example %}

The following code represents a simple usage of the API:

```java
package com.datadog.civisibility.example;

import datadog.trace.api.civisibility.CIVisibility;
import datadog.trace.api.civisibility.DDTest;
import datadog.trace.api.civisibility.DDTestModule;
import datadog.trace.api.civisibility.DDTestSession;
import datadog.trace.api.civisibility.DDTestSuite;
import java.lang.reflect.Method;

// the null arguments in the calls below are optional startTime/endTime values:
// when they are not specified, current time is used
public class ManualTest {
    public static void main(String[] args) throws Exception {
        DDTestSession testSession = CIVisibility.startSession("my-project-name", "my-test-framework", null);
        testSession.setTag("my-tag", "additional-session-metadata");
        try {
            runTestModule(testSession);
        } finally {
            testSession.end(null);
        }
    }

    private static void runTestModule(DDTestSession testSession) throws Exception {
        DDTestModule testModule = testSession.testModuleStart("my-module", null);
        testModule.setTag("my-module-tag", "additional-module-metadata");
        try {
            runFirstTestSuite(testModule);
            runSecondTestSuite(testModule);
        } finally {
            testModule.end(null);
        }
    }

    private static void runFirstTestSuite(DDTestModule testModule) throws Exception {
        DDTestSuite testSuite = testModule.testSuiteStart("my-suite", ManualTest.class, null);
        testSuite.setTag("my-suite-tag", "additional-suite-metadata");
        try {
            runTestCase(testSuite);
        } finally {
            testSuite.end(null);
        }
    }

    private static void runTestCase(DDTestSuite testSuite) throws Exception {
        Method myTestCaseMethod = ManualTest.class.getDeclaredMethod("myTestCase");
        DDTest ddTest = testSuite.testStart("myTestCase", myTestCaseMethod, null);
        ddTest.setTag("my-test-case-tag", "additional-test-case-metadata");
        ddTest.setTag("my-test-case-tag", "more-test-case-metadata");
        try {
            myTestCase();
        } catch (Exception e) {
            ddTest.setErrorInfo(e); // pass error info to mark test case as failed
        } finally {
            ddTest.end(null);
        }
    }

    private static void myTestCase() throws Exception {
        // run some test logic
    }

    private static void runSecondTestSuite(DDTestModule testModule) {
        DDTestSuite secondTestSuite = testModule.testSuiteStart("my-second-suite", ManualTest.class, null);
        secondTestSuite.setSkipReason("this test suite is skipped"); // pass skip reason to mark test suite as skipped
        secondTestSuite.end(null);
    }
}
```

Always call `datadog.trace.api.civisibility.DDTestSession#end` at the end so that all the test info is flushed to Datadog.

## Best practices{% #best-practices %}

### Deterministic test parameters representation{% #deterministic-test-parameters-representation %}

Test Optimization works best when the [test parameters are deterministic](https://docs.datadoghq.com/tests/#parameterized-test-configurations) and stay the same between test runs. If a test case has a parameter that varies between test executions (such as a current date, a random number, or an instance of a class whose `toString()` method is not overridden), some of the product features may not work as expected. For example, the history of executions may not be available, or the test case may not be classified as flaky even if it exhibits flakiness.

The best way to fix this is to make sure that the test parameters are the same between test runs.

In JUnit 5, this can also be addressed by [customizing the string representation of the test parameters](https://junit.org/junit5/docs/current/user-guide/#writing-tests-parameterized-tests-display-names) without changing their values. To do so, use the `org.junit.jupiter.api.Named` interface or change the `name` parameter of the `org.junit.jupiter.params.ParameterizedTest` annotation:

```java
@ParameterizedTest
@MethodSource("namedArguments")
void parameterizedTest(String s, Date d) {
   // The second parameter in this test case is non-deterministic.
   // In the argument provider method it is wrapped with Named to ensure it has a deterministic name.
}

static Stream<Arguments> namedArguments() {
    return Stream.of(
            Arguments.of(
                    "a string",
                    Named.of("current date", new Date())),
            Arguments.of(
                    "another string",
                    Named.of("a date in the future", new Date(System.currentTimeMillis() + TimeUnit.DAYS.toMillis(1))))
    );
}
```

```java
@ParameterizedTest(name = "[{index}] {0}, a random number from one to ten")
@MethodSource("randomArguments")
void anotherParameterizedTest(String s, int i) {
  // The second parameter in this test case is non-deterministic.
  // The name of the parameterized test is customized to ensure it has a deterministic name.
}

static Stream<Arguments> randomArguments() {
    return Stream.of(
            Arguments.of("a string", ThreadLocalRandom.current().nextInt(10) + 1),
            Arguments.of("another string", ThreadLocalRandom.current().nextInt(10) + 1)
    );
}
```

### Test session name `DD_TEST_SESSION_NAME`{% #test-session-name-dd_test_session_name %}

Use `DD_TEST_SESSION_NAME` to define the name of the test session and the related group of tests. Examples of values for this tag would be:

- `unit-tests`
- `integration-tests`
- `smoke-tests`
- `flaky-tests`
- `ui-tests`
- `backend-tests`

If `DD_TEST_SESSION_NAME` is not specified, the default value used is a combination of the:

- CI job name
- Command used to run the tests (such as `mvn test`)

The test session name needs to be unique within a repository to help you distinguish different groups of tests.
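For example, a CI job that runs only unit tests might set a stable session name before invoking the build:

```shell
# Give this group of tests a stable, human-readable session name.
export DD_TEST_SESSION_NAME=unit-tests
```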

#### When to use `DD_TEST_SESSION_NAME`{% #when-to-use-dd_test_session_name %}

There's a set of parameters that Datadog checks to establish correspondence between test sessions. The test command used to execute the tests is one of them. If the test command contains a string that changes for every execution, such as a temporary folder, Datadog considers the sessions to be unrelated to each other. For example:

- `mvn test --temp-dir=/var/folders/t1/rs2htfh55mz9px2j4prmpg_c0000gq/T`

Datadog recommends using `DD_TEST_SESSION_NAME` if your test commands vary between executions.

## Troubleshooting{% #troubleshooting %}

### The tests are not appearing in Datadog after enabling Test Optimization in the tracer{% #the-tests-are-not-appearing-in-datadog-after-enabling-test-optimization-in-the-tracer %}

Verify that the tracer is injected into your build process by examining your build's logs. If the injection is successful, you can see a line containing `DATADOG TRACER CONFIGURATION`. If the line is not there, make sure that the environment variables used to inject and configure the tracer are available to the build process. A common mistake is to set the variables in a build step and run the tests in another build step. This approach may not work if the variables are not propagated between build steps.

Ensure that you are using the latest version of the tracer.

Verify that your build system and testing framework are supported by Test Optimization. See the list of supported build systems and test frameworks in the [Compatibility](#compatibility) section above.

Ensure that the `dd.civisibility.enabled` property (or `DD_CIVISIBILITY_ENABLED` environment variable) is set to `true` in the tracer arguments.

Try running your build with tracer debug logging enabled by setting the `DD_TRACE_DEBUG` environment variable to `true`. Check the build output for any errors that indicate tracer misconfiguration, such as an unset `DD_API_KEY` environment variable.

### Tests or source code compilation fails when building a project with the tracer attached{% #tests-or-source-code-compilation-fails-when-building-a-project-with-the-tracer-attached %}

By default, Test Optimization runs Java code compilation with a compiler plugin attached.

The plugin is optional, as it only serves to reduce the performance overhead.

Depending on the build configuration, adding the plugin can sometimes disrupt the compilation process.

If the plugin interferes with the build, disable it by adding `dd.civisibility.compiler.plugin.auto.configuration.enabled=false` to the list of `-javaagent` arguments (or by setting the `DD_CIVISIBILITY_COMPILER_PLUGIN_AUTO_CONFIGURATION_ENABLED` environment variable to `false`).
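As a sketch for a Maven build, the setting can also be passed as a JVM system property next to the `-javaagent` flag (shown here via `MAVEN_OPTS`; this assumes the standard `dd.*` system-property form of Datadog tracer configuration, so adjust for your build tool):

```shell
# Disable the Datadog compiler plugin while keeping the tracer attached.
export MAVEN_OPTS="-javaagent:$DD_TRACER_FOLDER/dd-java-agent.jar -Ddd.civisibility.compiler.plugin.auto.configuration.enabled=false"
```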

### Build fails because the dd-javac-plugin-client artifact cannot be found{% #builds-fails-because-dd-javac-plugin-client-artifact-cannot-be-found %}

The Java compiler plugin injected into the build may be unavailable if the build uses a custom artifact repository or runs in offline mode.

If this is the case, you can disable plugin injection by adding `dd.civisibility.compiler.plugin.auto.configuration.enabled=false` to the list of `-javaagent` arguments (or by setting the `DD_CIVISIBILITY_COMPILER_PLUGIN_AUTO_CONFIGURATION_ENABLED` environment variable to `false`).

The plugin is optional, as it only serves to reduce the performance overhead.

### Tests fail when building a project with the tracer attached{% #tests-fail-when-building-a-project-with-the-tracer-attached %}

In some cases, attaching the tracer can break tests, especially those that assert on the internal state of the JVM or on instances of classes from third-party libraries.

While the best approach in such cases is to update the tests, a quicker option is to disable the tracer's third-party library integrations.

The integrations provide additional insights into what happens in the tested code and are especially useful in integration tests, to monitor things like HTTP requests or database calls. They are enabled by default.

To disable a specific integration, refer to the [Datadog Tracer Compatibility](https://docs.datadoghq.com/tracing/trace_collection/compatibility/java#integrations) table for the relevant configuration property names. For example, to disable `OkHttp3` client request integration, add `dd.integration.okhttp-3.enabled=false` to the list of `-javaagent` arguments.

To disable all integrations, add `dd.trace.enabled=false` to the list of `-javaagent` arguments (or set the `DD_TRACE_ENABLED` environment variable to `false`).

## Further reading{% #further-reading %}

- [Forwarding Environment Variables for Tests in Containers](https://docs.datadoghq.com/tests/containers/)
- [Explore Test Results and Performance](https://docs.datadoghq.com/tests/explorer)
- [Detect test flakiness with Early Flake Detection](https://docs.datadoghq.com/tests/flaky_test_management/early_flake_detection)
- [Retry failing test cases with Auto Test Retries](https://docs.datadoghq.com/tests/flaky_test_management/auto_test_retries)
- [Correlate logs and test traces](https://docs.datadoghq.com/tests/correlate_logs_and_tests)
- [Troubleshooting Test Optimization](https://docs.datadoghq.com/tests/troubleshooting/)
