Enhancing Developer Workflows with Datadog


Overview


Test Visibility integrates with other developer-oriented Datadog products, as well as external partners such as GitHub, to streamline developer workflows.

These features are available for all Test Visibility customers, and they do not require usage of the Datadog GitHub integration.

Create and open GitHub issues

With Test Visibility, you can create and open pre-filled GitHub issues that contain relevant context about your tests, as well as deep links back to Datadog, for more streamlined debugging workflows. Creating issues directly from Test Visibility helps you track and maintain accountability for test failures and flaky tests.

In-app entry points

You can create pre-filled GitHub issues from three areas within Test Visibility:

Commit Overview

You can reach the overview page for any commit from a particular branch or from within any individual test.

Datadog GitHub issues preview

From the Commit Overview page, click on any row in the Failed Tests or New Flaky Tests tables and select Open issue in GitHub.

Branch Overview

From the Branch Overview page, click on any row in the Flaky Tests table and select Open issue in GitHub.

Datadog GitHub issues flaky tests table preview

Test Details View

From within a specific test run, click the Actions button and select Open issue in GitHub.

Datadog GitHub issues test detail view preview

You also have the option to copy an issue description in Markdown for pasting test details elsewhere. The Markdown description contains information such as the test execution link, service, branch, commit, author, and error.

Copy issue description in Markdown format for GitHub issues
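
The copied description might look like the following sketch. The fields match those listed above; the specific names, links, and values are purely illustrative:

```markdown
### Failed test: `LoginTests.test_invalid_password`

- **Test execution:** [View in Datadog](https://app.datadoghq.com/ci/test-runs)
- **Service:** `web-backend`
- **Branch:** `feature/login-fix`
- **Commit:** `abc1234`
- **Author:** `jane.doe`
- **Error:** `AssertionError: expected status 200, got 401`
```

Because the description is plain Markdown, you can paste it into any tool that renders Markdown, such as a pull request comment or a team chat message, not just a GitHub issue.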

Sample GitHub issue

Below is what a pre-filled GitHub issue might look like:

Pre-filled GitHub issue

Create Jira issues

With Case Management, you can create and open pre-filled Jira issues that contain relevant context about your tests, as well as deep links back to Datadog, for more streamlined debugging workflows. Creating issues directly from Test Visibility helps you track and maintain accountability for test failures and flaky tests.

When you update the status of a Jira issue, the status in Case Management updates and reflects the latest case status.

In-app entry points

After you have set up the Jira integration, you can create cases from three areas within Test Visibility:

You can manually create a Jira issue from a case in Case Management by pressing Shift + J.

Commit Overview

You can reach the overview page for any commit from a particular branch or from within any individual test.

Create a Case Management issue in the Commit Overview page

From the Commit Overview page, click on any row in the Failed Tests or New Flaky Tests tables and select Create case.

Branch Overview

From the Branch Overview page, click on any row in the Flaky Tests table and select Create case.

Create a Case Management issue in the Flaky Tests list

Test Runs View

From within a specific test run, click the Actions button and select Create case.

Create a Case Management issue in the Test Runs side panel

For more information about configuring the Jira integration, see the Case Management documentation.

Open tests in GitHub and your IDE

In-app entry points

After detecting a failed or flaky test within Datadog, you can open that test in GitHub or your IDE to fix it immediately.

Under the Error Message section in the Overview tab of a test run, click the View Code button to view the relevant lines of code for that test within Visual Studio Code, IntelliJ, or GitHub.

Open test in IDE

The order of options in this dropdown changes depending on the language your test was written in:

  • IntelliJ is prioritized for Java-based tests
  • Visual Studio Code is prioritized for JavaScript and Python-based tests

Installing IDE plugins

IDE plugins and extensions are required to view your test in your IDE.

  • If you do not have the VS Code extension installed, click View in VS Code to open the extension directly in VS Code for installation.
  • If you do not have the IntelliJ plugin installed, click View in IntelliJ to download and install the plugin. Compatible Datadog plugin versions are listed on the Plugin Versions page.

Test summaries in GitHub pull requests

Datadog integrates with GitHub to show test result summaries directly in your pull requests. The summary, posted as a pull request comment, contains an overview of test executions, flakiness information, error messages for failed tests, performance regressions, and code coverage changes.

Datadog GitHub pull request comment preview

With this report, developers get instant feedback about their test results, including the ability to debug any failed or flaky tests without leaving the pull request view.

This integration is only available for test services hosted on `github.com`.

Enable test summaries

You can enable test summaries in pull requests with the following steps:

  1. Install the GitHub integration:
    1. Navigate to the Configuration tab on the GitHub integration tile and click + Create GitHub App.
    2. Give the application read and write permissions for pull requests.
  2. Enable test summaries for one or more test services. You can do this from the Test Service Settings page or from the commit or branch page.

Test service settings page

  1. Navigate to the Test Service Settings page and search for the repository or test service.
  2. Click on the toggle under the GitHub Comments column for the desired service.
The Test Service Settings tab in Datadog with GitHub comments enabled for one test service

Commit or branch page

  1. Go to the test service commit or branch page where you want to enable GitHub comments.
  2. Click on the Settings icon and click View Test Service Settings.
  3. Select Enable GitHub Comments so that comments display on new pull requests. This change may take a few minutes to take effect.
Enable GitHub comments dropdown

Comments only appear on pull requests that were opened before the test run and that have run at least one test for an enabled test service.
