---
title: .NET Tests
description: Datadog, the leading service for cloud-scale monitoring.
breadcrumbs: Docs > Test Optimization in Datadog > Configure Test Optimization > .NET Tests
---

# .NET Tests

{% callout %}
# Important note for users on the following Datadog sites: app.ddog-gov.com

{% alert level="danger" %}
This product is not supported for your selected [Datadog site](https://docs.datadoghq.com/getting_started/site).
{% /alert %}

{% /callout %}

## Compatibility{% #compatibility %}

For a list of supported runtimes and platforms, see [.NET Framework Compatibility](https://docs.datadoghq.com/tracing/trace_collection/compatibility/dotnet-framework/) and [.NET/.NET Core Compatibility](https://docs.datadoghq.com/tracing/trace_collection/compatibility/dotnet-core/).

Supported test frameworks:

| Test Framework                                                                                                                                                    | Version    |
| ----------------------------------------------------------------------------------------------------------------------------------------------------------------- | ---------- |
| xUnit                                                                                                                                                             | \>= 2.2    |
| NUnit                                                                                                                                                             | \>= 3.0    |
| MsTestV2                                                                                                                                                          | \>= 14     |
| [BenchmarkDotNet](https://docs.datadoghq.com/continuous_integration/tests/setup/dotnet/?tab=onpremisesciproviderdatadogagent#instrumenting-benchmarkdotnet-tests) | \>= 0.13.2 |

## Configuring reporting method{% #configuring-reporting-method %}

To report test results to Datadog, you need to configure the Datadog .NET library:

{% tab title="CI Provider with Auto-Instrumentation Support" %}
Datadog supports auto-instrumentation for the following CI providers:

| CI Provider    | Auto-Instrumentation method                                                                                                                         |
| -------------- | --------------------------------------------------------------------------------------------------------------------------------------------------- |
| GitHub Actions | [Datadog Test Visibility Github Action](https://github.com/marketplace/actions/configure-datadog-test-visibility)                                   |
| Jenkins        | [UI-based configuration](https://docs.datadoghq.com/continuous_integration/pipelines/jenkins/#enable-test-optimization) with Datadog Jenkins plugin |
| GitLab         | [Datadog Test Visibility GitLab Script](https://github.com/DataDog/test-visibility-gitlab-script)                                                   |
| CircleCI       | [Datadog Test Visibility CircleCI Orb](https://circleci.com/orbs/registry/orb/datadog/test-visibility-circleci-orb)                                 |

If you are using auto-instrumentation for one of these providers, you can skip the rest of the setup steps below.
{% /tab %}

{% tab title="Other Cloud CI Provider" %}
If you are using a cloud CI provider without access to the underlying worker nodes, such as GitHub Actions or CircleCI, configure the library to use the Agentless mode. For this, set the following environment variables:

{% dl %}

{% dt %}
`DD_CIVISIBILITY_AGENTLESS_ENABLED=true` (Required)
{% /dt %}

{% dd %}
Enables or disables Agentless mode.

**Default**: `false`
{% /dd %}

{% dt %}
`DD_API_KEY` (Required)
{% /dt %}

{% dd %}
The [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys) used to upload the test results.

**Default**: `(empty)`
{% /dd %}

{% /dl %}

Additionally, configure the [Datadog site](https://docs.datadoghq.com/getting_started/site/) to which you want to send data.

{% dl %}

{% dt %}
`DD_SITE` (Required)
{% /dt %}

{% dd %}
The [Datadog site](https://docs.datadoghq.com/getting_started/site/) to upload results to.

**Default**: `datadoghq.com`
{% /dd %}

{% /dl %}
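For example, a minimal Agentless setup for a .NET test job might look like the following sketch (the API key value is a placeholder for your real secret):

```shell
# Agentless mode: report directly to Datadog without a local Agent
export DD_CIVISIBILITY_AGENTLESS_ENABLED=true
export DD_API_KEY="<your-api-key>"   # placeholder: use your real API key secret
export DD_SITE=datadoghq.com
dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci -- dotnet test
```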

{% /tab %}

{% tab title="On-Premises CI Provider" %}
If you are running tests on an on-premises CI provider, such as Jenkins or self-managed GitLab CI, install the Datadog Agent on each worker node by following the [Agent installation instructions](https://docs.datadoghq.com/agent/). This is the recommended option as it allows you to automatically link test results to [logs](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/) and [underlying host metrics](https://docs.datadoghq.com/infrastructure/).

If you are using a Kubernetes executor, Datadog recommends using the [Datadog Operator](https://docs.datadoghq.com/containers/datadog_operator/). The operator includes [Datadog Admission Controller](https://docs.datadoghq.com/agent/cluster_agent/admission_controller/) which can automatically [inject the tracer library](https://docs.datadoghq.com/tracing/trace_collection/library_injection_local/?tab=kubernetes) into the build pods. **Note:** If you use the Datadog Operator, there is no need to download and inject the tracer library since the Admission Controller can do this for you, so you can skip the corresponding step below. However, you still need to make sure that your pods set the environment variables or command-line parameters necessary to enable Test Visibility.

If you are not using Kubernetes, or you can't use the Datadog Admission Controller, and the CI provider uses a container-based executor, set the `DD_TRACE_AGENT_URL` environment variable (which defaults to `http://localhost:8126`) in the build container running the tracer to an endpoint that is accessible from within that container. **Note:** Using `localhost` inside the build references the container itself, not the underlying worker node or any container where the Agent might be running.

`DD_TRACE_AGENT_URL` includes the protocol and port (for example, `http://localhost:8126`) and takes precedence over `DD_AGENT_HOST` and `DD_TRACE_AGENT_PORT`. It is the recommended parameter for configuring the Datadog Agent's URL for CI Visibility.

If you still have issues connecting to the Datadog Agent, use Agentless mode instead. **Note:** When using this method, tests are not correlated with [logs](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/) and [infrastructure metrics](https://docs.datadoghq.com/infrastructure/).
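For example, in a container-based executor you might point the tracer at the Agent's address on the worker node. This is a sketch only; the hostname is illustrative and depends on your network setup:

```shell
# The Agent runs outside the build container, so localhost does not work here;
# use an address that resolves from inside the container (illustrative hostname)
export DD_TRACE_AGENT_URL=http://datadog-agent.example.internal:8126
dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci -- dotnet test
```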
{% /tab %}

## Installing the .NET tracer CLI{% #installing-the-net-tracer-cli %}

Install or update the `dd-trace` command in one of the following ways:

- Using the .NET SDK by running the command:

  ```shell
  dotnet tool update -g dd-trace
  ```

- By downloading the appropriate version:

  - Win-x64: [https://dtdg.co/dd-trace-dotnet-win-x64](https://dtdg.co/dd-trace-dotnet-win-x64)
  - Linux-x64: [https://dtdg.co/dd-trace-dotnet-linux-x64](https://dtdg.co/dd-trace-dotnet-linux-x64)
  - Linux-musl-x64 (Alpine): [https://dtdg.co/dd-trace-dotnet-linux-musl-x64](https://dtdg.co/dd-trace-dotnet-linux-musl-x64)

- Or by downloading [from the GitHub release page](https://github.com/DataDog/dd-trace-dotnet/releases).

## Instrumenting tests{% #instrumenting-tests %}

{% alert level="warning" %}
For BenchmarkDotNet, follow [these instructions](#instrumenting-benchmarkdotnet-tests).
{% /alert %}

To instrument your test suite, prefix your test command with `dd-trace ci run`, providing the name of the service or library under test as the `--dd-service` parameter, and the environment where tests are being run (for example, `local` when running tests on a developer workstation, or `ci` when running them on a CI provider) as the `--dd-env` parameter. For example:

{% tab title="dotnet test" %}
By using [dotnet test](https://docs.microsoft.com/en-us/dotnet/core/tools/dotnet-test):

```shell
dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci -- dotnet test
```

{% /tab %}

{% tab title="VSTest.Console" %}
By using [VSTest.Console.exe](https://docs.microsoft.com/en-us/visualstudio/test/vstest-console-options):

```shell
dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci -- VSTest.Console.exe {test_assembly}.dll
```

{% /tab %}

All tests are automatically instrumented.

### Compatibility with Microsoft.CodeCoverage nuget package{% #compatibility-with-microsoftcodecoverage-nuget-package %}

Starting with `Microsoft.CodeCoverage` version `17.2.0`, Microsoft introduced [dynamic instrumentation using the `.NET CLR Profiling API`](https://github.com/microsoft/codecoverage/blob/main/docs/instrumentation.md), enabled by default on Windows only. Datadog's automatic instrumentation also relies on the `.NET CLR Profiling API`, which allows only one subscriber at a time (for example, `dd-trace`). Using CodeCoverage dynamic instrumentation therefore breaks the automatic test instrumentation.

The solution is to switch from dynamic instrumentation to [static instrumentation](https://github.com/microsoft/codecoverage/blob/main/samples/Calculator/scenarios/scenario07/README.md). Modify your `.runsettings` file with the following configuration knobs:

```xml
<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
    <DataCollectionRunSettings>
        <DataCollectors>
            <DataCollector friendlyName="Code Coverage">
              <Configuration>
                <CodeCoverage>
                  <!-- Switching to static instrumentation (dynamic instrumentation collides with dd-trace instrumentation) -->
                  <EnableStaticManagedInstrumentation>True</EnableStaticManagedInstrumentation>
                  <EnableDynamicManagedInstrumentation>False</EnableDynamicManagedInstrumentation>
                  <UseVerifiableInstrumentation>False</UseVerifiableInstrumentation>
                  <EnableStaticNativeInstrumentation>True</EnableStaticNativeInstrumentation>
                  <EnableDynamicNativeInstrumentation>False</EnableDynamicNativeInstrumentation>
                  ...
                </CodeCoverage>
              </Configuration>
            </DataCollector>
        </DataCollectors>
    </DataCollectionRunSettings>
</RunSettings>
```

## Configuration settings{% #configuration-settings %}

You can change the default configuration of the CLI by using command line arguments or environment variables. For a full list of configuration settings, run:

```shell
dd-trace ci run --help
```

The following list shows the default values for key configuration settings:

{% dl %}

{% dt %}
`--dd-service`
{% /dt %}

{% dd %}
Name of the service or library under test.

**Environment variable**: `DD_SERVICE`

**Default**: The repository name

**Example**: `my-dotnet-app`
{% /dd %}

{% dt %}
`--dd-env`
{% /dt %}

{% dd %}
Name of the environment where tests are being run.

**Environment variable**: `DD_ENV`

**Default**: `none`

**Examples**: `local`, `ci`
{% /dd %}

{% dt %}
`--agent-url`
{% /dt %}

{% dd %}
Datadog Agent URL for trace collection in the form `http://hostname:port`.

**Environment variable**: `DD_TRACE_AGENT_URL`

**Default**: `http://localhost:8126`
{% /dd %}

{% dt %}
`test_session.name` (only available as an environment variable)
{% /dt %}

{% dd %}
Identifies a group of tests, such as `integration-tests`, `unit-tests`, or `smoke-tests`.

**Environment variable**: `DD_TEST_SESSION_NAME`

**Default**: (CI job name + test command)

**Examples**: `unit-tests`, `integration-tests`, `smoke-tests`
{% /dd %}

{% /dl %}

For more information about `service` and `env` reserved tags, see [Unified Service Tagging](https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging). All other [Datadog Tracer configuration](https://docs.datadoghq.com/tracing/trace_collection/dd_libraries/dotnet-core/?tab=windows#configuration) options can also be used.
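As a sketch, CLI arguments and tracer environment variables can be combined; `--dd-service` and `--dd-env` map to `DD_SERVICE` and `DD_ENV`. The values below are illustrative:

```shell
# Any tracer setting can also be supplied as an environment variable
export DD_TAGS=team:backend        # illustrative extra tag
dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci --agent-url=http://localhost:8126 -- dotnet test
```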

### Adding custom tags to tests{% #adding-custom-tags-to-tests %}

To add custom tags to tests, first configure [custom instrumentation](#custom-instrumentation).

You can add custom tags to your tests by using the current active span:

```csharp
// inside your test
var scope = Tracer.Instance.ActiveScope; // from Datadog.Trace;
if (scope != null) {
    scope.Span.SetTag("test_owner", "my_team");
}
// test continues normally
// ...
```

To create filters or `group by` fields for these tags, you must first create facets. For more information about adding tags, see the [Adding Tags](https://docs.datadoghq.com/tracing/trace_collection/custom_instrumentation/dotnet?tab=locally#adding-tags) section of the .NET custom instrumentation documentation.

### Adding custom measures to tests{% #adding-custom-measures-to-tests %}

To add custom measures to tests, first configure [custom instrumentation](#custom-instrumentation).

Just like tags, you can add custom measures to your tests by using the current active span:

```csharp
// inside your test
var scope = Tracer.Instance.ActiveScope; // from Datadog.Trace;
if (scope != null) {
    scope.Span.SetTag("memory_allocations", 16);
}
// test continues normally
// ...
```

To create filters or visualizations for these measures, you must first create facets. For more information, see the [Adding Tags](https://docs.datadoghq.com/tracing/trace_collection/custom_instrumentation/dotnet?tab=locally#adding-tags) section of the .NET custom instrumentation documentation.

Read more about custom measures in the [Add Custom Measures Guide](https://docs.datadoghq.com/tests/guides/add_custom_measures/?tab=net).

### Reporting code coverage{% #reporting-code-coverage %}

When code coverage is available, the Datadog Tracer (v2.31.0 or later) reports it under the `test.code_coverage.lines_pct` tag for your test sessions.

If you are using [Coverlet](https://github.com/coverlet-coverage/coverlet) to compute your code coverage, indicate the path to the report file in the `DD_CIVISIBILITY_EXTERNAL_CODE_COVERAGE_PATH` environment variable when running `dd-trace`. The report file must be in OpenCover or Cobertura format. Alternatively, you can enable the Datadog Tracer's built-in code coverage calculation with the `DD_CIVISIBILITY_CODE_COVERAGE_ENABLED=true` environment variable.
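For example, a sketch of a Coverlet-based run, assuming the collector writes a Cobertura report to the path shown (the report path is illustrative and depends on your Coverlet configuration):

```shell
# Point dd-trace at an existing Coverlet report (illustrative path)
export DD_CIVISIBILITY_EXTERNAL_CODE_COVERAGE_PATH=./coverage/coverage.cobertura.xml
dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci -- dotnet test --collect:"XPlat Code Coverage"
```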

**Note**: When using Test Impact Analysis, the tracer's built-in code coverage is enabled by default.

You can see the evolution of the test coverage in the **Coverage** tab of a test session.

For more information about exclusion options, see [Code Coverage](https://docs.datadoghq.com/continuous_integration/tests/code_coverage/?tab=net).

### Instrumenting BenchmarkDotNet tests{% #instrumenting-benchmarkdotnet-tests %}

To instrument your benchmark tests, you need to:

1. Add the [`Datadog.Trace.BenchmarkDotNet` NuGet package](https://www.nuget.org/packages/Datadog.Trace.BenchmarkDotNet) to your project (for example, using `dotnet add package Datadog.Trace.BenchmarkDotNet`).
1. Configure your project to use the `Datadog.Trace.BenchmarkDotNet` exporter using the `DatadogDiagnoser` attribute or the `WithDatadog()` extension method. For example:

{% tab title="Using the [DatadogDiagnoser] Attribute" %}

```csharp
using BenchmarkDotNet.Attributes;
using Datadog.Trace.BenchmarkDotNet;

[DatadogDiagnoser]
[MemoryDiagnoser]
public class OperationBenchmark
{
    [Benchmark]
    public void Operation()
    {
        // ...
    }
}
```

{% /tab %}

{% tab title="Using the Configuration" %}

```csharp
using BenchmarkDotNet.Configs;
using BenchmarkDotNet.Running;
using Datadog.Trace.BenchmarkDotNet;

var config = DefaultConfig.Instance
              .WithDatadog();

BenchmarkRunner.Run<OperationBenchmark>(config);
```

{% /tab %}
Then, [configure the reporting method](https://docs.datadoghq.com/continuous_integration/tests/dotnet/#configuring-reporting-method) and run the benchmark project as you normally do. All benchmark tests are automatically instrumented.

## Collecting Git metadata{% #collecting-git-metadata %}

Datadog uses Git information for visualizing your test results and grouping them by repository, branch, and commit. Git metadata is automatically collected by the test instrumentation from CI provider environment variables and the local `.git` folder in the project path, if available.

If you are running tests in non-supported CI providers or with no `.git` folder, you can set the Git information manually using environment variables. These environment variables take precedence over any auto-detected information. Set the following environment variables to provide Git information:

{% dl %}

{% dt %}
`DD_GIT_REPOSITORY_URL`
{% /dt %}

{% dd %}
URL of the repository where the code is stored. Both HTTP and SSH URLs are supported.

**Examples**: `git@github.com:MyCompany/MyApp.git`, `https://github.com/MyCompany/MyApp.git`
{% /dd %}

{% dt %}
`DD_GIT_BRANCH`
{% /dt %}

{% dd %}
Git branch being tested. Leave empty if providing tag information instead.

**Example**: `develop`
{% /dd %}

{% dt %}
`DD_GIT_TAG`
{% /dt %}

{% dd %}
Git tag being tested (if applicable). Leave empty if providing branch information instead.

**Example**: `1.0.1`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_SHA`
{% /dt %}

{% dd %}
Full commit hash.

**Example**: `a18ebf361cc831f5535e58ec4fae04ffd98d8152`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_MESSAGE`
{% /dt %}

{% dd %}
Commit message.

**Example**: `Set release number`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_AUTHOR_NAME`
{% /dt %}

{% dd %}
Commit author name.

**Example**: `John Smith`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_AUTHOR_EMAIL`
{% /dt %}

{% dd %}
Commit author email.

**Example**: `john@example.com`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_AUTHOR_DATE`
{% /dt %}

{% dd %}
Commit author date in ISO 8601 format.

**Example**: `2021-03-12T16:00:28Z`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_COMMITTER_NAME`
{% /dt %}

{% dd %}
Commit committer name.

**Example**: `Jane Smith`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_COMMITTER_EMAIL`
{% /dt %}

{% dd %}
Commit committer email.

**Example**: `jane@example.com`
{% /dd %}

{% dt %}
`DD_GIT_COMMIT_COMMITTER_DATE`
{% /dt %}

{% dd %}
Commit committer date in ISO 8601 format.

**Example**: `2021-03-12T16:00:28Z`
{% /dd %}

{% /dl %}
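For example, a sketch of providing Git metadata manually when no `.git` folder is available (values are illustrative):

```shell
# Manually supplied Git metadata takes precedence over auto-detection
export DD_GIT_REPOSITORY_URL=https://github.com/MyCompany/MyApp.git
export DD_GIT_BRANCH=develop
export DD_GIT_COMMIT_SHA=a18ebf361cc831f5535e58ec4fae04ffd98d8152
dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci -- dotnet test
```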

## Custom instrumentation{% #custom-instrumentation %}

{% alert level="danger" %}
**Note:** Your custom instrumentation setup depends on the `dd-trace` version. To use the custom instrumentation, you must keep the package versions for `dd-trace` and `Datadog.Trace` NuGet packages in sync.
{% /alert %}

To use the custom instrumentation in your .NET application:

1. Execute `dd-trace --version` to get the version of the tool.
1. Add the `Datadog.Trace` [NuGet package](https://www.nuget.org/packages/Datadog.Trace) with the same version to your application.
1. In your application code, access the global tracer through the `Datadog.Trace.Tracer.Instance` property to create new spans.

For more information about how to add spans and tags for custom instrumentation, see the [.NET Custom Instrumentation documentation](https://docs.datadoghq.com/tracing/trace_collection/custom_instrumentation/dotnet/).

## Manual testing API{% #manual-testing-api %}

{% alert level="danger" %}
**Note:** To use the manual testing API, you must add the `Datadog.Trace` NuGet package in the target .NET project.
{% /alert %}

If you use XUnit, NUnit, or MSTest with your .NET projects, Test Optimization automatically instruments them and sends the test results to Datadog. If you use an unsupported testing framework or if you have a different testing mechanism, you can instead use the API to report test results to Datadog.

The API is built around three concepts: test modules, test suites, and tests.

### Test module{% #test-module %}

A test module represents the .NET assembly that includes the tests.

To start a test module, call `TestModule.Create()` and pass the name of the module or .NET assembly name where the tests are located.

When all your tests have finished, call `module.Close()` or `module.CloseAsync()`, which forces the library to send all remaining test results to the backend.

### Test suites{% #test-suites %}

A test suite comprises a set of tests. Suites can share common initialization and teardown methods as well as variables. In .NET, they are usually implemented as a test class or fixture containing multiple test methods. A test suite can optionally have additional information such as attributes or error information.

Create test suites in the test module by calling `module.GetOrCreateSuite()` and passing the name of the test suite.

Call `suite.Close()` when all the related tests in the suite have finished their execution.

### Tests{% #tests %}

Each test runs inside a suite and must end in one of these three statuses: `TestStatus.Pass`, `TestStatus.Fail`, or `TestStatus.Skip`.

A test can optionally have additional information like:

- Parameters
- Attributes
- Error information
- Test traits
- Benchmark data

Create tests in a suite by calling `suite.CreateTest()` and passing the name of the test. When a test ends, call `test.Close()` with one of the predefined statuses.

### Code example{% #code-example %}

The following code represents a simple usage of the API:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;
using Datadog.Trace.Ci;

var module = TestModule.Create(Assembly.GetExecutingAssembly().GetName().Name ?? "(dyn_module)");
module.SetTag("ModuleTag", "Value");

var suite = module.GetOrCreateSuite("MySuite");
suite.SetTag("SuiteTag", 42);

var test = suite.CreateTest("Test01");
test.SetTag("TestTag", "Value");
test.SetParameters(new TestParameters
{
    Arguments = new Dictionary<string, object>
    {
        ["a"] = 42,
        ["b"] = 0,
    }
});
test.SetTraits(new Dictionary<string, List<string>>
{
    ["Category"] = new () { "UnitTest" }
});

try
{
    var a = 42;
    var b = 0;
    var c = a / b;
}
catch (Exception ex)
{
    test.SetErrorInfo(ex);
}

test.Close(TestStatus.Fail);
suite.Close();
await module.CloseAsync();
```

Always call `module.Close()` or `module.CloseAsync()` at the end so that all the test data is flushed to Datadog.

## Best practices{% #best-practices %}

### Test session name `DD_TEST_SESSION_NAME`{% #test-session-name-dd_test_session_name %}

Use `DD_TEST_SESSION_NAME` to define the name of the test session and the related group of tests. Examples of values for this tag would be:

- `unit-tests`
- `integration-tests`
- `smoke-tests`
- `flaky-tests`
- `ui-tests`
- `backend-tests`

If `DD_TEST_SESSION_NAME` is not specified, the default value used is a combination of the:

- CI job name
- Command used to run the tests (such as `dotnet test`)

The test session name needs to be unique within a repository to help you distinguish different groups of tests.

#### When to use `DD_TEST_SESSION_NAME`{% #when-to-use-dd_test_session_name %}

There's a set of parameters that Datadog checks to establish correspondence between test sessions. The test command used to execute the tests is one of them. If the test command contains a string that changes for every execution, such as a temporary folder, Datadog considers the sessions to be unrelated to each other. For example:

- `dotnet test --temp-dir=/var/folders/t1/rs2htfh55mz9px2j4prmpg_c0000gq/T`

Datadog recommends using `DD_TEST_SESSION_NAME` if your test commands vary between executions.
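For example, a job whose test command embeds a temporary directory can pin a stable session name. This is a sketch; the names are illustrative:

```shell
# Without DD_TEST_SESSION_NAME, the changing --temp-dir value would make
# every session look unrelated to the previous ones
export DD_TEST_SESSION_NAME=unit-tests
dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci -- dotnet test --temp-dir="$(mktemp -d)"
```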

## Further reading{% #further-reading %}

- [Forwarding Environment Variables for Tests in Containers](https://docs.datadoghq.com/continuous_integration/tests/containers/)
- [Explore Test Results and Performance](https://docs.datadoghq.com/continuous_integration/tests)
- [Speed up your test jobs with Test Impact Analysis](https://docs.datadoghq.com/tests/test_impact_analysis/dotnet)
- [Troubleshooting Test Optimization](https://docs.datadoghq.com/tests/troubleshooting/)
