The Tests page, under the CI menu in Datadog, provides a test-first view into your CI health, surfacing important metrics and results from your tests. It helps you drill down into the performance problems and test failures that matter to you because you work on the related code, rather than because you maintain the pipelines the tests run in.
The Tests page offers two views: Branches and Test Services.
The Branches view of the Tests page lists all branches from all test services that have reported test results. This view helps individual developers quickly see the status of tests that run on their code branches and troubleshoot test failures.
On this page, you can filter the list by branch name, test service, or commit SHA. To show only your branches (branches that contain at least one commit authored by you), enable the My branches toggle and add the email addresses you use in your Git configuration.
For each branch, the list shows test results for its latest commit: a consolidated count of tests broken down by status (taking retries into account) and the number of new flaky tests introduced by the commit. A flaky test is defined as a test that both passes and fails on the same commit.
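That definition of flakiness can be sketched in a few lines. This is a minimal illustration, not Datadog's actual data model: it assumes test results arrive as hypothetical `(commit_sha, test_name, status)` tuples and flags any test with both outcomes on the same commit.

```python
from collections import defaultdict


def flaky_tests(results):
    """Return the sorted names of tests that both passed and failed
    on the same commit, given (commit_sha, test_name, status) tuples
    where status is "pass" or "fail". (Illustrative shape only.)"""
    outcomes = defaultdict(set)
    for commit, test, status in results:
        outcomes[(commit, test)].add(status)
    # A test is flaky on a commit when both outcomes were observed.
    return sorted({test for (commit, test), seen in outcomes.items()
                   if {"pass", "fail"} <= seen})


runs = [
    ("abc123", "test_checkout", "fail"),
    ("abc123", "test_checkout", "pass"),  # retry passed: flaky
    ("abc123", "test_login", "pass"),
]
print(flaky_tests(runs))  # → ['test_checkout']
```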
There’s also information about the wall time of the most recent test suite run, and a comparison to the average wall time of the default branch. Wall time is the real time elapsed while the test suite runs, which is less than the sum of all test times when tests are run concurrently. The comparison of your branch’s wall time to the default branch’s wall time can help you determine if your commit is introducing performance regressions to your test suite.
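The distinction between wall time and summed test durations can be made concrete with a small sketch. It assumes, for simplicity, that the suite runs continuously from the first test's start to the last test's end; the interval values are invented for illustration.

```python
def wall_time(intervals):
    """Real elapsed time for a suite, given (start, end) times in
    seconds. Assumes the suite runs continuously from the earliest
    start to the latest end."""
    starts, ends = zip(*intervals)
    return max(ends) - min(starts)


# Two 10-second tests run concurrently on separate workers:
runs = [(0.0, 10.0), (0.0, 10.0)]
total = sum(end - start for start, end in runs)
print(wall_time(runs))  # → 10.0 (wall time)
print(total)            # → 20.0 (summed test durations)
```

With concurrency, wall time (10 s) is half the summed durations (20 s), which is why the Branches view compares wall times rather than per-test totals.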
Hovering over the commit author avatar shows detailed information about the latest commit.
Click a row to see test suite run details for the latest commit on that branch (you can also switch branches): failing tests and their most common errors, slow tests, flaky tests, and a complete list of test runs over the selected time frame. You can filter this list of test runs by facet to get to the information you want to see most.
Click into one of the test runs to see the test trace as a flame graph or a span list. The Runs (n) list on the left lets you quickly access traces for each retry of the test for the same commit.
Click the CI provider link to drill down to the Resource, Service, or Analytics page for the test. You can also find complete tags information and links to related log events and network monitoring events.
A test service is a group of tests associated with, for example, a project or repo. It contains all the individual tests for your code, optionally organized into test suites (which are like folders for your tests). The Test Services view of the Tests page shows aggregated health metrics for the default branch of each test service. This view is useful for teams to understand the overall health of the service over time.
The Test Services view shows the same information as the Branches view, but applied to the default branch and sorted by most recent. It compares the most recent wall time with the default branch's average wall time, indicating how your test suite's performance is trending over time.
Click on a row to see the analytics for tests run on the default branch, similar to drilling down for test run details from the Branches view.
On the Test Runs page, you can see the list of all runs over the selected time frame, filter by facet, and drill down into individual test run details. Each test run is reported as a trace, which in the case of integration tests includes calls made to datastores or third party services using regular APM instrumentation.
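As an illustration of such an integration test, here is a hedged sketch in Python. When run with APM instrumentation enabled (for example via ddtrace's pytest plugin, `pytest --ddtrace`), the datastore call below would appear as a child span inside the test's trace; the sqlite3 usage is a stand-in for any instrumented datastore or third-party client, and the schema is invented.

```python
import sqlite3


def test_user_lookup():
    # An in-memory database standing in for a real datastore.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'ada')")
    # With APM instrumentation active, this query would be reported
    # as a span in the test run's trace.
    row = conn.execute("SELECT name FROM users WHERE id = 1").fetchone()
    assert row == ("ada",)
```

The test also runs as a plain pytest test without any instrumentation; the tracing layer only adds spans around the calls it recognizes.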
Click a particular test run to see its flame graph or span list for each time it has run, similar to clicking into a test run from the Tests page.
You can also interactively plot graphs and top lists using the Analytics tab.
Spans generated by third-party services that are instrumented with APM and involved in integration tests appear in APM. To filter down to spans generated as part of an integration test, use the Origin Service facet and select the test service name used by the integration test.