---
title: Set up Static Code Analysis (SAST)
description: >-
  Learn about Datadog Static Code Analysis to scan code for quality issues and
  security vulnerabilities before your code reaches production.
breadcrumbs: >-
  Docs > Datadog Security > Code Security > Static Code Analysis (SAST) > Set up
  Static Code Analysis (SAST)
---

# Set up Static Code Analysis (SAST)

{% callout %}
# Important note for users on the following Datadog sites: app.ddog-gov.com, us2.ddog-gov.com

{% alert level="danger" %}
Code Security is not supported for your selected [Datadog site](https://docs.datadoghq.com/getting_started/site.md).
{% /alert %}

{% /callout %}

## Overview{% #overview %}

To set up Datadog SAST in-app, navigate to [**Security** > **Code Security**](https://app.datadoghq.com/security/configuration/code-security/setup).

## Select where to run Static Code Analysis scans{% #select-where-to-run-static-code-analysis-scans %}

### Scan with Datadog-hosted scanning{% #scan-with-datadog-hosted-scanning %}

You can run Datadog Static Code Analysis (SAST) scans directly on Datadog infrastructure. Supported repository types include:

- [GitHub](https://docs.datadoghq.com/security/code_security/static_analysis/setup.md?tab=github#select-your-source-code-management-provider) (excluding repositories that use [Git Large File Storage](https://docs.github.com/en/repositories/working-with-files/managing-large-files/about-git-large-file-storage))
- [GitLab.com and GitLab Self-Managed](https://docs.datadoghq.com/security/code_security/static_analysis/setup.md?tab=gitlab#select-your-source-code-management-provider)
- [Azure DevOps](https://docs.datadoghq.com/security/code_security/static_analysis/setup.md?tab=azuredevops#select-your-source-code-management-provider)

To get started, navigate to the [**Code Security** page](https://app.datadoghq.com/security/configuration/code-security/setup).

### Scan in CI pipelines{% #scan-in-ci-pipelines %}

Datadog Static Code Analysis runs in your CI pipelines using the [`datadog-ci` CLI](https://github.com/DataDog/datadog-ci).

First, configure your Datadog API and application keys: add `DD_API_KEY` and `DD_APP_KEY` as secrets in your CI provider. Ensure your Datadog application key has the `code_analysis_read` scope.

Next, run Static Code Analysis by following instructions for your chosen CI provider below.

- [GitHub Actions](https://docs.datadoghq.com/security/code_security/static_analysis/setup/github_actions.md)
- [Generic CI Providers](https://docs.datadoghq.com/security/code_security/static_analysis/setup/generic_ci_providers.md)
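As an illustration, a minimal GitHub Actions job might wire these pieces together as follows. This is a sketch only: the job and step names are arbitrary, the analyzer installation step depends on your platform, and the provider-specific pages above remain the authoritative setup instructions.

```yaml
# Illustrative CI job (assumes DD_API_KEY and DD_APP_KEY are configured
# as repository secrets; analyzer flags mirror the diff-aware example
# later in this page).
name: Datadog Static Code Analysis
on: push
jobs:
  static-analysis:
    runs-on: ubuntu-latest
    env:
      DD_API_KEY: ${{ secrets.DD_API_KEY }}
      DD_APP_KEY: ${{ secrets.DD_APP_KEY }}
      DD_SITE: datadoghq.com
    steps:
      - uses: actions/checkout@v4
      - name: Install datadog-ci
        run: npm install -g @datadog/datadog-ci
      - name: Run static analysis and upload results
        run: |
          # Install datadog-static-analyzer for your platform first
          # (see the CI provider instructions linked above).
          datadog-static-analyzer -i . -g -o sarif.json -f sarif
          datadog-ci sarif upload sarif.json
```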

## Select your source code management provider{% #select-your-source-code-management-provider %}

Datadog Static Code Analysis supports all source code management providers, with native support for GitHub, GitLab, and Azure DevOps.

{% tab title="GitHub" %}
Configure a GitHub App with the [GitHub integration tile](https://docs.datadoghq.com/integrations/github.md#link-a-repository-in-your-organization-or-personal-account) and set up the [source code integration](https://docs.datadoghq.com/integrations/guide/source-code-integration.md) to enable inline code snippets and [pull request comments](https://docs.datadoghq.com/security/code_security/dev_tool_int/github_pull_requests.md).

When installing a GitHub App, the following permissions are required to enable certain features:

- `Content: Read`, which allows you to see code snippets displayed in Datadog
- `Pull Request: Read & Write`, which allows Datadog to add feedback for violations directly in your pull requests using [pull request comments](https://docs.datadoghq.com/security/code_security/dev_tool_int/github_pull_requests.md), as well as open pull requests to [fix vulnerabilities](https://docs.datadoghq.com/security/code_security/dev_tool_int.md)
- `Checks: Read & Write`, which allows you to create checks on SAST violations to block pull requests

{% /tab %}

{% tab title="GitLab" %}
See the [GitLab source code setup instructions](https://docs.datadoghq.com/integrations/gitlab-source-code.md#setup) to connect GitLab repositories to Datadog. Both GitLab.com and Self-Managed instances are supported.
{% /tab %}

{% tab title="Azure DevOps" %}
**Note:** Your Azure DevOps integrations must be connected to a Microsoft Entra tenant. Azure DevOps Server is **not** supported.

See the [Azure source code setup instructions](https://docs.datadoghq.com/integrations/azure-devops-source-code.md#setup) to connect Azure DevOps repositories to Datadog.
{% /tab %}

{% tab title="Other" %}
If you are using another source code management provider, configure Static Code Analysis to run in your CI pipelines using the `datadog-ci` CLI tool and upload the results to Datadog. You **must** run an analysis of your repository on the default branch before results can begin appearing on the **Code Security** page.
{% /tab %}

## Customize your configuration{% #customize-your-configuration %}

By default, Datadog Static Code Analysis (SAST) scans your repositories with [Datadog's default rulesets](https://docs.datadoghq.com/security/code_security/static_analysis/static_analysis_rules.md) for each programming language. You can customize which rulesets or rules run, along with other parameters, in Datadog or in a `code-security.datadog.yaml` file. For the full configuration reference, see [Static Code Analysis (SAST) Configuration](https://docs.datadoghq.com/security/code_security/static_analysis/configuration.md).
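As a minimal sketch, a `code-security.datadog.yaml` file at the repository root might enable specific rulesets. The keys and ruleset names below are examples; confirm them against the configuration reference linked above.

```yaml
# Illustrative only: verify keys and ruleset names against the
# Static Code Analysis (SAST) Configuration reference.
rulesets:
  - python-security
  - python-best-practices
```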

## Link findings to Datadog services and teams{% #link-findings-to-datadog-services-and-teams %}

Datadog associates code and library scan results with Datadog services and teams to automatically route findings to the appropriate owners. This enables service-level visibility, ownership-based workflows, and faster remediation.

To determine the service where a vulnerability belongs, Datadog evaluates several mapping mechanisms in the order listed in this section.

Each vulnerability is mapped with one method only: if a mapping mechanism succeeds for a particular finding, Datadog does not attempt the remaining mechanisms for that finding.

{% alert level="danger" %}
Using service definitions that include code locations in the Software Catalog is the only way to explicitly control how static findings are mapped to services. The additional mechanisms described below, such as Error Tracking usage patterns and naming-based inference, are not user-configurable and depend on existing data from other Datadog products. Consequently, these mechanisms might not provide consistent mappings for organizations not using these products.
{% /alert %}

### Mapping using the Software Catalog (recommended){% #mapping-using-the-software-catalog-recommended %}

Services in the Software Catalog identify their codebase content using the `codeLocations` field. This field is available in the **Software Catalog [schema version `v3`](https://docs.datadoghq.com/software_catalog/service_definitions/v3-0.md)** and allows a service to specify:

- a repository URL

```yaml
apiVersion: v3
kind: service
metadata:
  name: billing-service
  owner: billing-team
datadog:
  codeLocations:
    - repositoryURL: https://github.com/org/myrepo.git
```

- one or more code paths inside that repository

```yaml
apiVersion: v3
kind: service
metadata:
  name: billing-service
  owner: billing-team
datadog:
  codeLocations:
    - repositoryURL: https://github.com/org/myrepo.git
      paths:
        - path/to/service/code/**
```

If you want all the files in a repository to be associated with a service, you can use the glob `**` as follows:

```yaml
apiVersion: v3
kind: service
metadata:
  name: billing-service
  owner: billing-team
datadog:
  codeLocations:
    - repositoryURL: https://github.com/org/myrepo.git
      paths:
        - path/to/service/code/**
    - repositoryURL: https://github.com/org/billing-service.git
      paths:
        - "**"
```

The schema for this field is described in the [Software Catalog entity model](https://docs.datadoghq.com/internal_developer_portal/software_catalog/entity_model.md?tab=v30#codelocations).

Datadog goes through all Software Catalog definitions and checks whether the finding's file path matches a configured code location. For a finding to be mapped to a service through `codeLocations`, the finding must contain a file path.

Some findings might not contain a file path. In those cases, Datadog cannot evaluate `codeLocations` for that finding, and this mechanism is skipped.

{% alert level="danger" %}
Services defined with a Software Catalog schema v2.x do not support `codeLocations`. Existing definitions can be upgraded to the v3 schema in the Software Catalog. After migration is completed, changes might take up to 24 hours to apply to findings. If you are unable to upgrade to v3, Datadog falls back to alternative linking techniques (described below). These rely on less precise heuristics, so accuracy might vary depending on the Code Security product and your use of other Datadog features.
{% /alert %}

#### Example (v3 schema){% #example-v3-schema %}

```yaml
apiVersion: v3
kind: service
metadata:
  name: billing-service
  owner: billing-team
datadog:
  codeLocations:
    - repositoryURL: https://github.com/org/myrepo.git
      paths:
        - path/to/service/code/**
    - repositoryURL: https://github.com/org/billing-service.git
      paths:
        - "**"
```

#### SAST finding{% #sast-finding %}

If a vulnerability appeared in `github.com/org/myrepo` at `/src/billing/models/payment.py`, then, based on the `codeLocations` for `billing-service`, Datadog would add `billing-service` as an owning service. If your service defines an `owner` (as shown above), Datadog also links that team to the finding; in this case, the finding would be linked to `billing-team`.

#### SCA finding{% #sca-finding %}

If a library was declared in `github.com/org/myrepo` at `/go.mod`, then Datadog would not match it to `billing-service`.

Instead, if it was declared in `github.com/org/billing-service` at `/go.mod`, then Datadog would match it to `billing-service` due to the `"**"` catch-all glob. Consequently, Datadog would link the finding to the `billing-team`.

{% alert level="info" %}
Datadog attempts to map a single finding to as many services as possible. If no matches are found, Datadog moves on to the next linking method.
{% /alert %}

### When the Software Catalog cannot determine the service{% #when-the-software-catalog-cannot-determine-the-service %}

If the Software Catalog does not provide a match, either because the finding's file path does not match any `codeLocations`, or because the service uses the v2.x schema, Datadog evaluates whether Error Tracking can identify the service associated with the code. Datadog uses only the last 30 days of Error Tracking data due to product [data-retention limits](https://docs.datadoghq.com/data_security/data_retention_periods.md).

When Error Tracking processes stack traces, the traces often include file paths. For example, if an error occurs in `/foo/bar/baz.py`, Datadog inspects the directory `/foo/bar` and checks whether the finding's file path resides under that directory.

**If the finding file is under the same directory:**

- Datadog treats this as a strong indication that the vulnerability belongs to the same service.
- The finding inherits the *service* and *team* associated with that error in Error Tracking.

If this mapping succeeds, Datadog stops here.
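The directory check above can be sketched in shell. The file paths below are invented for illustration; the real check runs server-side against Error Tracking data.

```bash
# Hypothetical illustration of the Error Tracking directory check.
error_file="/foo/bar/baz.py"        # file path seen in a stack trace
error_dir=$(dirname "$error_file")  # directory Datadog inspects: /foo/bar
finding_file="/foo/bar/payment.py"  # file path of the static finding

case "$finding_file" in
  "$error_dir"/*) verdict="inherit service and team from Error Tracking" ;;
  *)              verdict="no match; try the next mechanism" ;;
esac
echo "$verdict"
```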

### Service inference from file paths or repository names{% #service-inference-from-file-paths-or-repository-names %}

When neither of the above strategies can determine the service, Datadog inspects naming patterns in the repository and file paths.

Datadog evaluates whether:

- The file path contains identifiers matching a known service.
- The repository name corresponds to a service name.

When using the finding's file path, Datadog performs a reverse search on each path segment until it finds a matching service or exhausts all options.

For example, if a finding occurs in `github.com/org/checkout-service` at `/foo/bar/baz/main.go`, Datadog takes the last path segment, `main`, and sees if any Software Catalog service uses that name. If there is a match, the finding is attributed to that service. If not, the process continues with `baz`, then `bar`, and so on.

When all path segments have been tried, Datadog checks whether the repository name, `checkout-service`, matches a Software Catalog service name. If no match is found, Datadog cannot link the finding using the Software Catalog.
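The reverse search described above can be sketched in shell. The service names here are invented; in practice, matching happens against your Software Catalog.

```bash
# Hypothetical sketch of name-based service inference.
repo="checkout-service"
file_path="foo/bar/baz/main.go"
known_services="billing-service checkout-service"

# Split the path into segments (dropping the file extension) and walk
# them from the innermost segment outward: main, baz, bar, foo.
IFS='/' read -r -a segments <<< "${file_path%.*}"
match=""
for ((i=${#segments[@]}-1; i>=0; i--)); do
  for svc in $known_services; do
    [ "${segments[i]}" = "$svc" ] && { match="$svc"; break 2; }
  done
done

# No segment matched: fall back to the repository name.
if [ -z "$match" ]; then
  for svc in $known_services; do
    [ "$repo" = "$svc" ] && match="$svc"
  done
fi
echo "matched service: ${match:-none}"
```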

This mechanism ensures that findings receive meaningful service attribution when no explicit metadata exists.

### Link findings to teams through Code Owners{% #link-findings-to-teams-through-code-owners %}

If Datadog is able to link your finding to a service using the above strategies, then the team that owns that service (if defined) is associated with that finding automatically.

Regardless of whether Datadog successfully links a finding to a service (and a Datadog team), Datadog uses the `CODEOWNERS` information from your finding's repository to link Datadog and GitHub teams to your findings.
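For instance, a `CODEOWNERS` file in your repository maps paths to Git provider teams. The paths and team names below are examples only.

```
# Example CODEOWNERS entries (illustrative paths and teams).
/src/billing/   @org/billing-team
*.tf            @org/platform-team
```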

{% alert level="info" %}
You must accurately map your Git provider teams to your [Datadog Teams](https://docs.datadoghq.com/account_management/teams.md) for team attribution to function properly.
{% /alert %}

## Diff-aware scanning{% #diff-aware-scanning %}

Diff-aware scanning enables Datadog's static analyzer to scan only the files modified by a commit in a feature branch. This significantly reduces scan time by avoiding a full-repository analysis on every scan. To enable diff-aware scanning in your CI pipeline, follow these steps:

1. Make sure your `DD_APP_KEY`, `DD_SITE`, and `DD_API_KEY` variables are set in your CI pipeline.
1. Add a call to `datadog-ci git-metadata upload` before invoking the static analyzer. This command ensures that Git metadata is available to the Datadog backend. Git metadata is required to calculate the number of files to analyze.
1. Ensure that `datadog-static-analyzer` is invoked with the `--diff-aware` flag.

Example command sequence (run these commands from within your Git repository):

```bash
datadog-ci git-metadata upload

datadog-static-analyzer -i /path/to/directory -g -o sarif.json -f sarif --diff-aware <...other-options...>
```

**Note:** When a diff-aware scan cannot be completed, the entire directory is scanned.

## Upload third-party static analysis results to Datadog{% #upload-third-party-static-analysis-results-to-datadog %}

{% alert level="info" %}
SARIF importing has been tested for Snyk, CodeQL, Semgrep, Gitleaks, and Sysdig. Reach out to [Datadog Support](https://docs.datadoghq.com/help) if you experience any issues with other SARIF-compliant tools.
{% /alert %}

You can send results from third-party static analysis tools to Datadog, provided they are in the interoperable [Static Analysis Results Interchange Format (SARIF)](https://www.oasis-open.org/committees/tc_home.php?wg_abbrev=sarif). Node.js version 14 or later is required.

To upload a SARIF report:

1. Ensure the [`DD_API_KEY` and `DD_APP_KEY` variables are defined](https://docs.datadoghq.com/account_management/api-app-keys.md).

1. Optionally, set a [`DD_SITE` variable](https://docs.datadoghq.com/getting_started/site.md) (this defaults to `datadoghq.com`).

1. Install the `datadog-ci` utility:

   ```bash
   npm install -g @datadog/datadog-ci
   ```

1. Run the third-party static analysis tool on your code and output the results in the SARIF format.

1. Upload the results to Datadog:

   ```bash
   datadog-ci sarif upload $OUTPUT_LOCATION
   ```

## SARIF Support Guidelines{% #sarif-support-guidelines %}

Datadog supports ingestion of third-party SARIF files that are compliant with [the 2.1.0 SARIF schema](https://docs.oasis-open.org/sarif/sarif/v2.1.0/sarif-v2.1.0.html). The SARIF schema is used differently by different static analysis tools. Before sending third-party SARIF files to Datadog, ensure they comply with the following requirements:

- The violation location is specified through the `physicalLocation` object of a result.
  - The `artifactLocation` and its `uri` **must be relative** to the repository root.
  - The `region` object is the part of the code highlighted in the Datadog UI.
- The `partialFingerprints` object is used to uniquely identify a finding across a repository.
- `properties` and `tags` add more information:
  - The tag `DATADOG_CATEGORY` specifies the category of the finding. Acceptable values are `SECURITY`, `PERFORMANCE`, `CODE_STYLE`, `BEST_PRACTICES`, and `ERROR_PRONE`.
  - Violations annotated with the category `SECURITY` are surfaced in the Vulnerabilities explorer and the Security tab of the repository view.
- The `tool` section must have a valid `driver` section with `name` and `version` attributes.

Here is an example of a SARIF file processed by Datadog:

```json
{
    "runs": [
        {
            "results": [
                {
                    "level": "error",
                    "locations": [
                        {
                            "physicalLocation": {
                                "artifactLocation": {
                                    "uri": "missing_timeout.py"
                                },
                                "region": {
                                    "endColumn": 76,
                                    "endLine": 6,
                                    "startColumn": 25,
                                    "startLine": 6
                                }
                            }
                        }
                    ],
                    "message": {
                        "text": "timeout not defined"
                    },
                    "partialFingerprints": {
                        "DATADOG_FINGERPRINT": "b45eb11285f5e2ae08598cb8e5903c0ad2b3d68eaa864f3a6f17eb4a3b4a25da"
                    },
                    "properties": {
                        "tags": [
                            "DATADOG_CATEGORY:SECURITY",
                            "CWE:1088"
                        ]
                    },
                    "ruleId": "python-security/requests-timeout",
                    "ruleIndex": 0
                }
            ],
            "tool": {
                "driver": {
                    "informationUri": "https://www.datadoghq.com",
                    "name": "<tool-name>",
                    "rules": [
                        {
                            "fullDescription": {
                                "text": "Access to remote resources should always use a timeout and appropriately handle the timeout and recovery. When using `requests.get`, `requests.put`, `requests.patch`, etc. - we should always use a `timeout` as an argument.\n\n#### Learn More\n\n - [CWE-1088 - Synchronous Access of Remote Resource without Timeout](https://cwe.mitre.org/data/definitions/1088.html)\n - [Python Best Practices: always use a timeout with the requests library](https://www.codiga.io/blog/python-requests-timeout/)"
                            },
                            "helpUri": "https://link/to/documentation",
                            "id": "python-security/requests-timeout",
                            "properties": {
                                "tags": [
                                    "CWE:1088"
                                ]
                            },
                            "shortDescription": {
                                "text": "no timeout was given on call to external resource"
                            }
                        }
                    ],
                    "version": "<tool-version>"
                }
            }
        }
    ],
    "version": "2.1.0"
}
```

## SARIF to CVSS severity mapping{% #sarif-to-cvss-severity-mapping %}

The [SARIF format](https://docs.oasis-open.org/sarif/sarif/v2.1.0/sarif-v2.1.0.html) defines four severities: none, note, warning, and error. However, Datadog reports violation and vulnerability severity using the [Common Vulnerability Scoring System](https://www.first.org/cvss/) (CVSS), which defines five severities: critical, high, medium, low, and none.

When ingesting SARIF files, Datadog maps SARIF severities into CVSS severities using the mapping rules below.

| SARIF severity | CVSS severity |
| -------------- | ------------- |
| Error          | Critical      |
| Warning        | High          |
| Note           | Medium        |
| None           | Low           |

## Data Retention{% #data-retention %}

Datadog stores findings in accordance with our [Data Retention Periods](https://docs.datadoghq.com/data_security/data_retention_periods.md). Datadog does not store or retain customer source code.
