To report test results to Datadog, you need to configure the Datadog .NET library:
If you are running tests on an on-premises CI provider, such as Jenkins or self-managed GitLab CI, install the Datadog Agent on each worker node by following the Agent installation instructions. This is the recommended option as test results are then automatically linked to the underlying host metrics.
If you are using a Kubernetes executor, Datadog recommends using the Datadog Admission Controller, which automatically sets the environment variables in the build pods to communicate with the local Datadog Agent.
If you are not using Kubernetes or can’t use Datadog Admission Controller and the CI provider is using a container-based executor, set the DD_TRACE_AGENT_URL environment variable (which defaults to http://localhost:8126) in the build container running the tracer to an endpoint that is accessible from within that container. Note that using localhost inside the build references the container itself and not the underlying worker node or any container where the Agent might be running.
The DD_TRACE_AGENT_URL value includes the protocol and port (for example, http://localhost:8126). It takes precedence over DD_AGENT_HOST and DD_TRACE_AGENT_PORT, and is the recommended parameter for configuring the Datadog Agent URL for CI Visibility.
If you still have issues connecting to the Datadog Agent, use Agentless Mode. Note: With this method, tests are not correlated with infrastructure metrics.
Agentless mode is available in Datadog .NET library versions 2.5.1 and later.
If you are using a cloud CI provider without access to the underlying worker nodes, such as GitHub Actions or CircleCI, configure the library to use the Agentless mode. For this, set the following environment variables:
DD_CIVISIBILITY_AGENTLESS_ENABLED=true (Required)
Enables or disables Agentless mode. Default: false
DD_API_KEY (Required)
The Datadog API key used to upload the test results. Default: (empty)
Additionally, configure the Datadog site to which you want to send data.
DD_SITE (Required)
The Datadog site to upload results to. Default: datadoghq.com
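Taken together, the variables above can be set as a CI step before running the tests. The following is a sketch assuming a POSIX shell on the CI runner; the service name and test command are placeholders:

```shell
# Enable Agentless mode and send results to datadoghq.com
export DD_CIVISIBILITY_AGENTLESS_ENABLED=true
export DD_API_KEY="<your-datadog-api-key>"   # store this in your CI provider's secret store
export DD_SITE=datadoghq.com

# Then run your instrumented test command, for example:
# dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci -- dotnet test
```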
Installing the .NET tracer CLI
Install or update the dd-trace command in one of the following ways:
To instrument your test suite, prefix your test command with dd-trace ci run, providing the name of the service or library under test as the --dd-service parameter, and the environment where tests are being run (for example, local when running tests on a developer workstation, or ci when running them on a CI provider) as the --dd-env parameter. For example:
dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci -- VSTest.Console.exe {test_assembly}.dll
All tests are automatically instrumented.
Configuration settings
You can change the default configuration of the CLI by using command line arguments or environment variables. For a full list of configuration settings, run:
dd-trace ci run --help
The following list shows the default values for key configuration settings:
--dd-service
Name of the service or library under test.
Environment variable: DD_SERVICE
Default: The repository name
Example: my-dotnet-app
--dd-env
Name of the environment where tests are being run.
Environment variable: DD_ENV
Default: none
Examples: local, ci
--agent-url
Datadog Agent URL for trace collection in the form http://hostname:port.
Environment variable: DD_TRACE_AGENT_URL
Default: http://localhost:8126
You can add custom tags to your tests by using the current active span:
// inside your test
var scope = Tracer.Instance.ActiveScope; // from Datadog.Trace
if (scope != null)
{
    scope.Span.SetTag("test_owner", "my_team");
}
// test continues normally
// ...
To create filters or group by fields for these tags, you must first create facets. For more information about adding tags, see the Adding Tags section of the .NET custom instrumentation documentation.
Just like tags, you can add custom metrics to your tests by using the current active span:
// inside your test
var scope = Tracer.Instance.ActiveScope; // from Datadog.Trace
if (scope != null)
{
    scope.Span.SetTag("memory_allocations", 16);
}
// test continues normally
// ...
To create filters or visualizations for these metrics, you must first create facets. For more information about adding tags, see the Adding Tags section of the .NET custom instrumentation documentation.
When code coverage is available, the Datadog Tracer (v2.31.0 or later) reports it under the test.code_coverage.lines_pct tag for your test sessions.
If you are using Coverlet to compute your code coverage, indicate the path to the report file in the DD_CIVISIBILITY_EXTERNAL_CODE_COVERAGE_PATH environment variable when running dd-trace. The report file must be in the OpenCover or Cobertura formats. Alternatively, you can enable the Datadog Tracer’s built-in code coverage calculation with the DD_CIVISIBILITY_CODE_COVERAGE_ENABLED=true environment variable.
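As a sketch of the Coverlet workflow described above (the report path and service name are hypothetical), the external coverage report can be passed to dd-trace like this:

```shell
# Hypothetical: run tests under dd-trace with Coverlet producing a Cobertura report,
# and point the tracer at the report file so it can extract the coverage percentage
export DD_CIVISIBILITY_EXTERNAL_CODE_COVERAGE_PATH=./coverage/coverage.cobertura.xml
dd-trace ci run --dd-service=my-dotnet-app --dd-env=ci -- \
  dotnet test --collect:"XPlat Code Coverage"
```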
Note: When using Intelligent Test Runner, the tracer’s built-in code coverage is enabled by default.
You can see the evolution of the test coverage in the Coverage tab of a test session.
For more information about exclusion options, see Code Coverage.
Configure your project to use the Datadog.Trace.BenchmarkDotNet exporter using the DatadogDiagnoser attribute or the WithDatadog() extension method. For example:
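A sketch of both options, assuming the Datadog.Trace.BenchmarkDotNet NuGet package is referenced (the benchmark class and method names are hypothetical):

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Configs;
using BenchmarkDotNet.Running;
using Datadog.Trace.BenchmarkDotNet;

// Option 1: annotate the benchmark class with the DatadogDiagnoser attribute
[DatadogDiagnoser]
[MemoryDiagnoser]
public class OperationBenchmark
{
    [Benchmark]
    public void Operation()
    {
        // code under benchmark
    }
}

public class Program
{
    public static void Main(string[] args)
    {
        // Option 2: add the exporter to a config with the WithDatadog() extension
        var config = DefaultConfig.Instance.WithDatadog();
        BenchmarkRunner.Run<OperationBenchmark>(config);
    }
}
```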
Run the benchmark project as you normally do; all benchmark tests are automatically instrumented.
Datadog uses Git information for visualizing your test results and grouping them by repository, branch, and commit. Git metadata is automatically collected by the test instrumentation from CI provider environment variables and the local .git folder in the project path, if available.
If you are running tests in non-supported CI providers or with no .git folder, you can set the Git information manually using environment variables. These environment variables take precedence over any auto-detected information. Set the following environment variables to provide Git information:
DD_GIT_REPOSITORY_URL
URL of the repository where the code is stored. Both HTTP and SSH URLs are supported. Example: git@github.com:MyCompany/MyApp.git, https://github.com/MyCompany/MyApp.git
DD_GIT_BRANCH
Git branch being tested. Leave empty if providing tag information instead. Example: develop
DD_GIT_TAG
Git tag being tested (if applicable). Leave empty if providing branch information instead. Example: 1.0.1
DD_GIT_COMMIT_SHA
Full commit hash. Example: a18ebf361cc831f5535e58ec4fae04ffd98d8152
DD_GIT_COMMIT_MESSAGE
Commit message. Example: Set release number
DD_GIT_COMMIT_AUTHOR_NAME
Commit author name. Example: John Smith
DD_GIT_COMMIT_AUTHOR_EMAIL
Commit author email. Example: john@example.com
DD_GIT_COMMIT_AUTHOR_DATE
Commit author date in ISO 8601 format. Example: 2021-03-12T16:00:28Z
DD_GIT_COMMIT_COMMITTER_NAME
Commit committer name. Example: Jane Smith
DD_GIT_COMMIT_COMMITTER_EMAIL
Commit committer email. Example: jane@example.com
DD_GIT_COMMIT_COMMITTER_DATE
Commit committer date in ISO 8601 format. Example: 2021-03-12T16:00:28Z
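A sketch of the manual Git configuration, using the example values from the list above (replace them with the values for your repository and commit):

```shell
# Provide Git metadata manually when running outside a supported CI provider
# or without a .git folder; these values override any auto-detected information.
export DD_GIT_REPOSITORY_URL="https://github.com/MyCompany/MyApp.git"
export DD_GIT_BRANCH="develop"
export DD_GIT_COMMIT_SHA="a18ebf361cc831f5535e58ec4fae04ffd98d8152"
export DD_GIT_COMMIT_MESSAGE="Set release number"
export DD_GIT_COMMIT_AUTHOR_NAME="John Smith"
export DD_GIT_COMMIT_AUTHOR_EMAIL="john@example.com"
export DD_GIT_COMMIT_AUTHOR_DATE="2021-03-12T16:00:28Z"
```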
Custom instrumentation
Note: Your custom instrumentation setup depends on the dd-trace version. To use custom instrumentation, you must keep the versions of the dd-trace tool and the Datadog.Trace NuGet package in sync.
To use the custom instrumentation in your .NET application:
Execute dd-trace --version to get the version of the tool.
Add the Datadog.Trace NuGet package with the same version to your application.
In your application code, access the global tracer through the Datadog.Trace.Tracer.Instance property to create new spans.
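As a minimal sketch of step 3 (the span operation name and tag are hypothetical), a custom span can be created through the global tracer:

```csharp
using Datadog.Trace;

public class OrderProcessor
{
    public void Process()
    {
        // create a custom span around the work you want to measure;
        // disposing the scope closes the span
        using (var scope = Tracer.Instance.StartActive("order.process"))
        {
            scope.Span.SetTag("order.kind", "online");
            // ... your application logic ...
        }
    }
}
```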
Note: To use the manual testing API, you must add the Datadog.Trace NuGet package in the target .NET project.
If you use XUnit, NUnit, or MSTest with your .NET projects, CI Visibility automatically instruments them and sends the test results to Datadog. If you use an unsupported testing framework or if you have a different testing mechanism, you can instead use the API to report test results to Datadog.
The API is based around three concepts: test modules, test suites, and tests.
Test module
A test module represents the .NET assembly that includes the tests.
To start a test module, call TestModule.Create() and pass the name of the module or .NET assembly name where the tests are located.
When all your tests have finished, call module.Close() or module.CloseAsync(), which forces the library to send all remaining test results to the backend.
Test suites
A test suite comprises a set of tests. Suites can have common initialization and teardown methods and share variables. In .NET, they are usually implemented as a test class or fixture containing multiple test methods. A test suite can optionally carry additional information such as attributes or error information.
Create test suites in the test module by calling module.GetOrCreateSuite() and passing the name of the test suite.
Call suite.Close() when all the related tests in the suite have finished their execution.
Tests
Each test runs inside a suite and must end in one of these three statuses: TestStatus.Pass, TestStatus.Fail, or TestStatus.Skip.
A test can optionally have additional information like:
Parameters
Attributes
Error information
Test traits
Benchmark data
Create tests in a suite by calling suite.CreateTest() and passing the name of the test. When a test ends, call test.Close() with one of the predefined statuses.
API interface
namespace Datadog.Trace.Ci
{
    /// <summary>CI Visibility test module</summary>
    public sealed class TestModule
    {
        /// <summary>Gets the test framework</summary>
        public string? Framework { get; }

        /// <summary>Gets the module name</summary>
        public string Name { get; }

        /// <summary>Gets the test module start date</summary>
        public System.DateTimeOffset StartTime { get; }

        /// <summary>Close test module</summary>
        /// <remarks>Use the CloseAsync() version whenever possible.</remarks>
        public void Close() { }

        /// <summary>Close test module</summary>
        /// <remarks>Use the CloseAsync() version whenever possible.</remarks>
        /// <param name="duration">Duration of the test module</param>
        public void Close(System.TimeSpan? duration) { }

        /// <summary>Close test module</summary>
        /// <returns>Task instance</returns>
        public System.Threading.Tasks.Task CloseAsync() { }

        /// <summary>Close test module</summary>
        /// <param name="duration">Duration of the test module</param>
        /// <returns>Task instance</returns>
        public System.Threading.Tasks.Task CloseAsync(System.TimeSpan? duration) { }

        /// <summary>Create a new test suite for this session</summary>
        /// <param name="name">Name of the test suite</param>
        /// <returns>Test suite instance</returns>
        public Datadog.Trace.Ci.TestSuite GetOrCreateSuite(string name) { }

        /// <summary>Create a new test suite for this session</summary>
        /// <param name="name">Name of the test suite</param>
        /// <param name="startDate">Test suite start date</param>
        /// <returns>Test suite instance</returns>
        public Datadog.Trace.Ci.TestSuite GetOrCreateSuite(string name, System.DateTimeOffset? startDate) { }

        /// <summary>Set error info from an exception</summary>
        /// <param name="exception">Exception instance</param>
        public void SetErrorInfo(System.Exception exception) { }

        /// <summary>Set error info</summary>
        /// <param name="type">Error type</param>
        /// <param name="message">Error message</param>
        /// <param name="callStack">Error callstack</param>
        public void SetErrorInfo(string type, string message, string? callStack) { }

        /// <summary>Sets a number tag on the module</summary>
        /// <param name="key">Key of the tag</param>
        /// <param name="value">Value of the tag</param>
        public void SetTag(string key, double? value) { }

        /// <summary>Sets a string tag on the module</summary>
        /// <param name="key">Key of the tag</param>
        /// <param name="value">Value of the tag</param>
        public void SetTag(string key, string? value) { }

        /// <summary>Create a new test module</summary>
        /// <param name="name">Test module name</param>
        /// <returns>New test module instance</returns>
        public static Datadog.Trace.Ci.TestModule Create(string name) { }

        /// <summary>Create a new test module</summary>
        /// <param name="name">Test module name</param>
        /// <param name="framework">Testing framework name</param>
        /// <param name="frameworkVersion">Testing framework version</param>
        /// <returns>New test module instance</returns>
        public static Datadog.Trace.Ci.TestModule Create(string name, string framework, string frameworkVersion) { }

        /// <summary>Create a new test module</summary>
        /// <param name="name">Test module name</param>
        /// <param name="framework">Testing framework name</param>
        /// <param name="frameworkVersion">Testing framework version</param>
        /// <param name="startDate">Test session start date</param>
        /// <returns>New test module instance</returns>
        public static Datadog.Trace.Ci.TestModule Create(string name, string framework, string frameworkVersion, System.DateTimeOffset startDate) { }
    }

    /// <summary>CI Visibility test suite</summary>
    public sealed class TestSuite
    {
        /// <summary>Gets the test module for this suite</summary>
        public Datadog.Trace.Ci.TestModule Module { get; }

        /// <summary>Gets the test suite name</summary>
        public string Name { get; }

        /// <summary>Gets the test suite start date</summary>
        public System.DateTimeOffset StartTime { get; }

        /// <summary>Close test suite</summary>
        public void Close() { }

        /// <summary>Close test suite</summary>
        /// <param name="duration">Duration of the test suite</param>
        public void Close(System.TimeSpan? duration) { }

        /// <summary>Create a new test for this suite</summary>
        /// <param name="name">Name of the test</param>
        /// <returns>Test instance</returns>
        public Datadog.Trace.Ci.Test CreateTest(string name) { }

        /// <summary>Create a new test for this suite</summary>
        /// <param name="name">Name of the test</param>
        /// <param name="startDate">Test start date</param>
        /// <returns>Test instance</returns>
        public Datadog.Trace.Ci.Test CreateTest(string name, System.DateTimeOffset startDate) { }

        /// <summary>Set error info from an exception</summary>
        /// <param name="exception">Exception instance</param>
        public void SetErrorInfo(System.Exception exception) { }

        /// <summary>Set error info</summary>
        /// <param name="type">Error type</param>
        /// <param name="message">Error message</param>
        /// <param name="callStack">Error callstack</param>
        public void SetErrorInfo(string type, string message, string? callStack) { }

        /// <summary>Sets a number tag on the suite</summary>
        /// <param name="key">Key of the tag</param>
        /// <param name="value">Value of the tag</param>
        public void SetTag(string key, double? value) { }

        /// <summary>Sets a string tag on the suite</summary>
        /// <param name="key">Key of the tag</param>
        /// <param name="value">Value of the tag</param>
        public void SetTag(string key, string? value) { }
    }

    /// <summary>CI Visibility test</summary>
    public sealed class Test
    {
        /// <summary>Gets the test name</summary>
        public string? Name { get; }

        /// <summary>Gets the test start date</summary>
        public System.DateTimeOffset StartTime { get; }

        /// <summary>Gets the test suite for this test</summary>
        public Datadog.Trace.Ci.TestSuite Suite { get; }

        /// <summary>Add benchmark data</summary>
        /// <param name="measureType">Measure type</param>
        /// <param name="info">Measure info</param>
        /// <param name="statistics">Statistics values</param>
        public void AddBenchmarkData(Datadog.Trace.Ci.BenchmarkMeasureType measureType, string info, in Datadog.Trace.Ci.BenchmarkDiscreteStats statistics) { }

        /// <summary>Close test</summary>
        /// <param name="status">Test status</param>
        public void Close(Datadog.Trace.Ci.TestStatus status) { }

        /// <summary>Close test</summary>
        /// <param name="status">Test status</param>
        /// <param name="duration">Duration of the test</param>
        public void Close(Datadog.Trace.Ci.TestStatus status, System.TimeSpan? duration) { }

        /// <summary>Close test</summary>
        /// <param name="status">Test status</param>
        /// <param name="duration">Duration of the test</param>
        /// <param name="skipReason">Skip reason, if the test was skipped</param>
        public void Close(Datadog.Trace.Ci.TestStatus status, System.TimeSpan? duration, string? skipReason) { }

        /// <summary>Set benchmark metadata</summary>
        /// <param name="hostInfo">Host info</param>
        /// <param name="jobInfo">Job info</param>
        public void SetBenchmarkMetadata(in Datadog.Trace.Ci.BenchmarkHostInfo hostInfo, in Datadog.Trace.Ci.BenchmarkJobInfo jobInfo) { }

        /// <summary>Set error info from an exception</summary>
        /// <param name="exception">Exception instance</param>
        public void SetErrorInfo(System.Exception exception) { }

        /// <summary>Set error info</summary>
        /// <param name="type">Error type</param>
        /// <param name="message">Error message</param>
        /// <param name="callStack">Error callstack</param>
        public void SetErrorInfo(string type, string message, string? callStack) { }

        /// <summary>Set test parameters</summary>
        /// <param name="parameters">TestParameters instance</param>
        public void SetParameters(Datadog.Trace.Ci.TestParameters parameters) { }

        /// <summary>Sets a number tag on the test</summary>
        /// <param name="key">Key of the tag</param>
        /// <param name="value">Value of the tag</param>
        public void SetTag(string key, double? value) { }

        /// <summary>Sets a string tag on the test</summary>
        /// <param name="key">Key of the tag</param>
        /// <param name="value">Value of the tag</param>
        public void SetTag(string key, string? value) { }

        /// <summary>Set test method info</summary>
        /// <param name="methodInfo">Test MethodInfo instance</param>
        public void SetTestMethodInfo(System.Reflection.MethodInfo methodInfo) { }

        /// <summary>Set test traits</summary>
        /// <param name="traits">Traits dictionary</param>
        public void SetTraits(System.Collections.Generic.Dictionary<string, System.Collections.Generic.List<string>> traits) { }
    }

    /// <summary>Test status</summary>
    public enum TestStatus
    {
        /// <summary>Pass test status</summary>
        Pass = 0,
        /// <summary>Fail test status</summary>
        Fail = 1,
        /// <summary>Skip test status</summary>
        Skip = 2,
    }

    /// <summary>Test parameters</summary>
    public class TestParameters
    {
        /// <summary>Gets or sets the test arguments</summary>
        public System.Collections.Generic.Dictionary<string, object>? Arguments { get; set; }

        /// <summary>Gets or sets the test parameters metadata</summary>
        public System.Collections.Generic.Dictionary<string, object>? Metadata { get; set; }
    }

    /// <summary>Benchmark measurement discrete stats</summary>
    public readonly struct BenchmarkDiscreteStats
    {
        /// <summary>Kurtosis value</summary>
        public readonly double Kurtosis;
        /// <summary>Max value</summary>
        public readonly double Max;
        /// <summary>Mean value</summary>
        public readonly double Mean;
        /// <summary>Median value</summary>
        public readonly double Median;
        /// <summary>Min value</summary>
        public readonly double Min;
        /// <summary>Number of samples</summary>
        public readonly int N;
        /// <summary>90 percentile value</summary>
        public readonly double P90;
        /// <summary>95 percentile value</summary>
        public readonly double P95;
        /// <summary>99 percentile value</summary>
        public readonly double P99;
        /// <summary>Skewness value</summary>
        public readonly double Skewness;
        /// <summary>Standard deviation value</summary>
        public readonly double StandardDeviation;
        /// <summary>Standard error value</summary>
        public readonly double StandardError;

        /// <summary>Initializes a new instance of the <see cref="BenchmarkDiscreteStats"/> struct.</summary>
        public BenchmarkDiscreteStats(int n, double max, double min, double mean, double median, double standardDeviation, double standardError, double kurtosis, double skewness, double p99, double p95, double p90) { }

        /// <summary>Get benchmark discrete stats from an array of doubles</summary>
        /// <param name="values">Array of doubles</param>
        /// <returns>Benchmark discrete stats instance</returns>
        public static Datadog.Trace.Ci.BenchmarkDiscreteStats GetFrom(double[] values) { }
    }

    /// <summary>Benchmark host info</summary>
    public struct BenchmarkHostInfo
    {
        /// <summary>Chronometer frequency</summary>
        public double? ChronometerFrequencyHertz;
        /// <summary>Chronometer resolution</summary>
        public double? ChronometerResolution;
        /// <summary>Logical core count</summary>
        public int? LogicalCoreCount;
        /// <summary>OS version</summary>
        public string? OsVersion;
        /// <summary>Physical core count</summary>
        public int? PhysicalCoreCount;
        /// <summary>Physical processor count</summary>
        public int? ProcessorCount;
        /// <summary>Processor max frequency in hertz</summary>
        public double? ProcessorMaxFrequencyHertz;
        /// <summary>Processor name</summary>
        public string? ProcessorName;
        /// <summary>Runtime version</summary>
        public string? RuntimeVersion;
    }

    /// <summary>Benchmark job info</summary>
    public struct BenchmarkJobInfo
    {
        /// <summary>Job description</summary>
        public string? Description;
        /// <summary>Job platform</summary>
        public string? Platform;
        /// <summary>Job runtime moniker</summary>
        public string? RuntimeMoniker;
        /// <summary>Job runtime name</summary>
        public string? RuntimeName;
    }

    /// <summary>Benchmark measure type</summary>
    public enum BenchmarkMeasureType
    {
        /// <summary>Duration in nanoseconds</summary>
        Duration = 0,
        /// <summary>Run time in nanoseconds</summary>
        RunTime = 1,
        /// <summary>Mean heap allocations in bytes</summary>
        MeanHeapAllocations = 2,
        /// <summary>Total heap allocations in bytes</summary>
        TotalHeapAllocations = 3,
        /// <summary>Application launch in nanoseconds</summary>
        ApplicationLaunch = 4,
        /// <summary>Garbage collector gen0 count</summary>
        GarbageCollectorGen0 = 5,
        /// <summary>Garbage collector gen1 count</summary>
        GarbageCollectorGen1 = 6,
        /// <summary>Garbage collector gen2 count</summary>
        GarbageCollectorGen2 = 7,
        /// <summary>Memory total operations count</summary>
        MemoryTotalOperations = 8,
    }
}
Code example
The following code represents a simple usage of the API: