Overview

Datadog Software Composition Analysis (SCA) scans your repositories for open-source libraries and detects known security vulnerabilities before you ship to production.

Supported languages: C#, Go, Java, JavaScript, PHP, Python, Ruby

To get started:

  1. Open Code Security settings.
  2. In Activate scanning for your repositories, click Manage Repositories.
  3. Choose where to run SCA scans (Datadog-hosted or CI pipelines).
  4. Follow the setup instructions for your source code provider.

The sections below cover the different ways to configure SCA for your repositories.

Select where to run static SCA scans

You can run SCA scans in two ways:

  • Datadog-hosted: For GitHub repositories (except those using Git Large File Storage).
  • CI Pipelines: For any provider, including GitHub, GitLab, and Azure DevOps.

Analyze code directly in Datadog

For GitHub repositories, you can run Datadog SCA scans directly on Datadog infrastructure. To get started, navigate to Code Security settings.

Datadog-hosted SCA does not support repositories using Git Large File Storage.
Instead, use CI Pipelines for these repositories.

Analyze code in your CI Pipelines

Run SCA by following the instructions for your chosen CI provider below. Datadog SCA offers native support for GitHub, GitLab, and Azure DevOps.

You must scan your default branch at least once before results appear in Datadog Code Security.

Run SCA scans in your CI Pipelines

Datadog SCA scans libraries in the following languages and requires a lockfile to report them:

Language | Package Manager | Lockfile
---------|-----------------|---------
C# | .NET | packages.lock.json
Go | mod | go.mod
JVM | Gradle | gradle.lockfile
JVM | Maven | pom.xml
Node.js | npm | package-lock.json
Node.js | pnpm | pnpm-lock.yaml
Node.js | yarn | yarn.lock
PHP | composer | composer.lock
Python | pip | requirements.txt, Pipfile.lock
Python | poetry | poetry.lock
Ruby | bundler | Gemfile.lock
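
If a project does not yet have a lockfile, you can usually generate one with the package manager itself. The commands below are common examples (not an exhaustive list); check your package manager's documentation for the exact workflow:

# npm: creates package-lock.json without installing modules
npm install --package-lock-only

# .NET: creates packages.lock.json (requires lock-file support enabled for the project)
dotnet restore --use-lock-file

# Bundler: creates or updates Gemfile.lock
bundle lock

# Gradle: writes gradle.lockfile for configurations with dependency locking enabled
gradle dependencies --write-locks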

Select your source code management provider

Datadog SCA supports all source code management providers, with native support for GitHub, GitLab, and Azure DevOps.

If GitHub is your source code management provider, you must configure a GitHub App using the GitHub integration tile and set up the source code integration to see inline code snippets and enable pull request comments.

When installing a GitHub App, the following permissions are required to enable certain features:

  • Content: Read, which allows you to see code snippets displayed in Datadog.
  • Pull Request: Read & Write, which allows Datadog to add feedback for violations directly in your pull requests using pull request comments.
  • Checks: Read & Write, which allows you to create checks on SAST violations to block pull requests.

Repositories from Azure DevOps are supported in closed Preview. Your Azure DevOps organizations must be connected to a Microsoft Entra tenant. Join the Preview.

If Azure DevOps is your source code management provider, before you can begin installation, you must request access to the closed Preview using the form above. After being granted access, follow the instructions below to complete the setup process.

Note: Azure DevOps Server is not supported.

Create and register a Microsoft Entra app

If you are an admin in your Azure portal, you can configure Entra apps to connect your tenant to Datadog.

  1. Navigate to Code Security setup.
  2. In Activate scanning for your repositories, click Manage Repositories.
  3. Select CI Pipelines.
  4. Select the scan types you want to use.
  5. Select Azure DevOps as your source code management provider.
  6. If this is your first time connecting an Azure DevOps organization to Datadog, click Connect Azure DevOps Account.
  7. When connecting a Microsoft Entra tenant for the first time, you need to go to your Azure Portal and register a new application. During this creation process, ensure the following:
    1. You select Accounts in this organizational directory only (Datadog, Inc. only - Single tenant) as the account type.
    2. Set the redirect URI to Web and paste the URI given to you in the instructions.
  8. Copy the values for Application (client) ID and Directory (tenant) ID and paste them into Datadog.
  9. In the Azure Portal for your app registration, navigate to Manage > Certificates & secrets and switch to Client secrets.
  10. Click New client secret and create a secret with your desired description and expiration values.
  11. Copy and paste the string in the Value column for your new secret, paste it into Datadog, and click Create Configuration to complete connecting your Entra tenant to Datadog.
  12. Add one or more Azure DevOps organizations by pasting the organization slug into Datadog, then add your Service Principal as a user under Organization settings > Users > Add users.
    1. Your Service Principal needs the Basic access level and at least the Project Contributor security group.
  13. Click Submit Organization.
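
If you prefer to script the Entra app registration and client secret from steps 7-11 above, the Azure CLI sketch below shows one way to do it. The display name, secret description, and placeholder values are illustrative, not values Datadog requires; use the redirect URI shown in the Datadog instructions.

# Register a single-tenant app ("Accounts in this organizational directory only")
az ad app create \
  --display-name "datadog-code-security" \
  --sign-in-audience AzureADMyOrg \
  --web-redirect-uris "<redirect-uri-from-datadog-instructions>"

# Create a client secret; paste the returned password value into Datadog
az ad app credential reset \
  --id "<application-client-id>" \
  --display-name "datadog-code-security" \
  --years 1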

Configure project service hooks

To enable all Code Security features in Azure DevOps, you’ll need to use a Datadog API key to configure service hooks for your projects.

First, set your environment variables (note: the Datadog UI will fill these values out for you):

export AZURE_DEVOPS_TOKEN="..."                 # Client Secret Value
export DD_API_KEY="..."                         # Datadog API Key

Then, replace the placeholders in the script below with your Datadog Site and Azure DevOps organization name to configure the necessary service hooks on your organization’s projects:

curl https://raw.githubusercontent.com/DataDog/azdevops-sci-hooks/refs/heads/main/setup-hooks.py > setup-hooks.py && chmod a+x ./setup-hooks.py
./setup-hooks.py --dd-site="<dd-site>" --az-devops-org="<org-name>"

The setup-hooks.py CLI that automates this process is available in the DataDog/azdevops-sci-hooks repository on GitHub.

Repositories from GitLab instances are supported in closed Preview. Join the Preview.

If GitLab is your source code management provider, before you can begin installation, you must request access to the closed Preview using the form above. After being granted access, follow these instructions to complete the setup process.

If you are using another source code management provider, configure SCA to run in your CI pipelines using the datadog-ci CLI tool and upload the results to Datadog.

Authentication

To upload results to Datadog, you must be authenticated. To ensure you’re authenticated, configure the following environment variables:

Name | Description | Required | Default
-----|-------------|----------|--------
DD_API_KEY | Your Datadog API key. This key is created by your Datadog organization and should be stored as a secret. | Yes | (none)
DD_APP_KEY | Your Datadog application key. This key, created by your Datadog organization, should include the code_analysis_read scope and be stored as a secret. | Yes | (none)
DD_SITE | The Datadog site to send information to. | No | datadoghq.com
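
For example, in a CI job you might set these variables before running the scan. The placeholder values below are illustrative; store the keys as CI secrets rather than hardcoding them:

export DD_API_KEY="<your-api-key>"   # created by your Datadog organization, stored as a secret
export DD_APP_KEY="<your-app-key>"   # requires the code_analysis_read scope
export DD_SITE="datadoghq.com"       # replace with your Datadog site if different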

Running options

There are two ways to run SCA scans from within your CI Pipelines:

Run Via Pipelines Integration

You can run SCA scans automatically as part of your CI/CD workflows using built-in integrations for popular CI providers.

GitHub Actions

SCA can run as a job in your GitHub Actions workflows. The action provided below invokes Datadog’s recommended SBOM tool, Datadog SBOM Generator, on your codebase and uploads the results into Datadog.

Add the following code snippet in .github/workflows/datadog-sca.yml.

Make sure to replace the dd_site attribute with the Datadog site you are using.

datadog-sca.yml

on: [push]

name: Datadog Software Composition Analysis

jobs:
  software-composition-analysis:
    runs-on: ubuntu-latest
    name: Datadog SBOM Generation and Upload
    steps:
    - name: Checkout
      uses: actions/checkout@v3
    - name: Check imported libraries are secure and compliant
      id: datadog-software-composition-analysis
      uses: DataDog/datadog-sca-github-action@main
      with:
        dd_api_key: ${{ secrets.DD_API_KEY }}
        dd_app_key: ${{ secrets.DD_APP_KEY }}
        dd_site: "datadoghq.com"

Related GitHub Actions

Datadog Static Code Analysis (SAST) analyzes your first-party code. Static Code Analysis can be set up using the datadog-static-analyzer-github-action GitHub action.
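
A minimal workflow sketch for that action follows. It mirrors the SCA workflow above; the input names and the @main reference are assumptions, so check the datadog-static-analyzer-github-action README before relying on it.

datadog-sast.yml

on: [push]

name: Datadog Static Code Analysis

jobs:
  static-code-analysis:
    runs-on: ubuntu-latest
    name: Datadog Static Code Analysis
    steps:
    - name: Checkout
      uses: actions/checkout@v3
    - name: Check your first-party code for violations
      id: datadog-static-code-analysis
      uses: DataDog/datadog-static-analyzer-github-action@main
      with:
        dd_api_key: ${{ secrets.DD_API_KEY }}
        dd_app_key: ${{ secrets.DD_APP_KEY }}
        dd_site: "datadoghq.com"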

Azure DevOps Pipelines

To add a new pipeline in Azure DevOps, go to Pipelines > New Pipeline, select your repository, and then create/select a pipeline.

Add the following content to your Azure DevOps pipeline YAML file:

datadog-sca.yml

trigger:
  branches:
    include:
      # Optionally specify a specific branch to trigger on when merging
      - "*"

pr:
  branches:
    include:
      - "*"

variables:
  - group: "Datadog"


jobs:
  - job: DatadogSoftwareCompositionAnalysis
    displayName: "Datadog Software Composition Analysis"
    steps:
      - script: |
          npm install -g @datadog/datadog-ci
          export DATADOG_SBOM_GENERATOR_URL="https://github.com/DataDog/datadog-sbom-generator/releases/latest/download/datadog-sbom-generator_linux_amd64.zip"
          mkdir -p /tmp/datadog-sbom-generator
          curl -L -o /tmp/datadog-sbom-generator/datadog-sbom-generator.zip $DATADOG_SBOM_GENERATOR_URL
          unzip /tmp/datadog-sbom-generator/datadog-sbom-generator.zip -d /tmp/datadog-sbom-generator
          chmod 755 /tmp/datadog-sbom-generator/datadog-sbom-generator
          /tmp/datadog-sbom-generator/datadog-sbom-generator scan --output=/tmp/sbom.json .
          datadog-ci sbom upload /tmp/sbom.json          
        env:
          DD_APP_KEY: $(DD_APP_KEY)
          DD_API_KEY: $(DD_API_KEY)
          DD_SITE: datadoghq.com

For all other providers, use the customizable script in the section below to run SCA scans and upload results to Datadog.

Run Via Customizable Script

If you use a different CI provider or want more control, you can run SCA scans using a customizable script. This approach lets you manually install and run the scanner, then upload results to Datadog from any environment.

For non-GitHub repositories, run your first scan on the default branch.
If your branch name is custom (not master, main, default, stable, source, prod, or develop), upload once and set the default branch in Repository Settings.

Prerequisites:

  • Unzip
  • Node.js 14 or later

# Set the Datadog site to send information to
export DD_SITE=""

# Install dependencies
npm install -g @datadog/datadog-ci

# Download the latest Datadog SBOM Generator:
# https://github.com/DataDog/datadog-sbom-generator/releases
DATADOG_SBOM_GENERATOR_URL=https://github.com/DataDog/datadog-sbom-generator/releases/latest/download/datadog-sbom-generator_linux_amd64.zip

# Install Datadog SBOM Generator
mkdir /datadog-sbom-generator
curl -L -o /datadog-sbom-generator/datadog-sbom-generator.zip $DATADOG_SBOM_GENERATOR_URL
unzip /datadog-sbom-generator/datadog-sbom-generator.zip -d /datadog-sbom-generator
chmod 755 /datadog-sbom-generator/datadog-sbom-generator

# Run Datadog SBOM Generator to scan your dependencies
/datadog-sbom-generator/datadog-sbom-generator scan --output=/tmp/sbom.json /path/to/repository

# Upload results to Datadog
datadog-ci sbom upload /tmp/sbom.json

This script uses the Linux x86_64 build of datadog-sbom-generator. For other systems, update the download URL; all releases are listed at https://github.com/DataDog/datadog-sbom-generator/releases.
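
For example, on an Apple Silicon macOS runner you would point at a darwin/arm64 asset instead. The exact asset name below is an assumption; confirm it against the releases page:

# Assumed asset name for macOS on Apple Silicon; verify on the releases page
DATADOG_SBOM_GENERATOR_URL=https://github.com/DataDog/datadog-sbom-generator/releases/latest/download/datadog-sbom-generator_darwin_arm64.zip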

Upload third-party SBOM to Datadog

Datadog recommends using the Datadog SBOM generator, but it is also possible to ingest a third-party SBOM.

You can upload SBOMs generated by other tools if they meet these requirements:

  • Valid CycloneDX 1.4, 1.5, or 1.6 JSON schema
  • All components have type library
  • All components have a valid purl attribute

Third-party SBOM files are uploaded to Datadog using the datadog-ci command.

You can use the following command to upload your third-party SBOM. Ensure the environment variables DD_API_KEY, DD_APP_KEY, and DD_SITE are set to your API key, application key, and Datadog site, respectively.

datadog-ci sbom upload /path/to/third-party-sbom.json
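
As an illustration, the snippet below writes a minimal CycloneDX 1.5 document that satisfies the requirements above (a single component of type library with a purl) and uploads it. The package shown is only an example:

# Write a minimal CycloneDX 1.5 SBOM containing one library component (example package)
cat > /tmp/third-party-sbom.json <<'EOF'
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "version": 1,
  "components": [
    {
      "type": "library",
      "name": "lodash",
      "version": "4.17.21",
      "purl": "pkg:npm/lodash@4.17.21"
    }
  ]
}
EOF

# Upload it with the same command as above
datadog-ci sbom upload /tmp/third-party-sbom.json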

Datadog associates static code and library scan results with relevant services by using the following mechanisms:

Schema version v3 and later of the Software Catalog lets you map code locations to your service. The codeLocations section specifies the location of the repository containing the code and its associated paths.

The paths attribute is a list of globs that should match paths in the repository.

entity.datadog.yaml

apiVersion: v3
kind: service
metadata:
  name: my-service
datadog:
  codeLocations:
    - repositoryURL: https://github.com/myorganization/myrepo.git
      paths:
        - path/to/service/code/**

If you want all the files in a repository to be associated with a service, you can use the glob `**/*` as follows:

entity.datadog.yaml

apiVersion: v3
kind: service
metadata:
  name: my-service
datadog:
  codeLocations:
    - repositoryURL: https://github.com/myorganization/myrepo.git
      paths:
        - "**/*"

Datadog detects file usage in additional products such as Error Tracking and associates those files with the runtime service. For example, if a service called foo has a log entry or a stack trace containing a file with the path /modules/foo/bar.py, the file /modules/foo/bar.py is associated with the service foo.

Datadog detects service names in paths and repository names, and associates the file with the service if a match is found.

For a repository match, if there is a service called myservice and the repository URL is https://github.com/myorganization/myservice.git, then myservice is associated with all files in the repository.

If no repository match is found, Datadog attempts to find a match in the path of the file. If there is a service named myservice, and the path is /path/to/myservice/foo.py, the file is associated with myservice because the service name is part of the path. If two services are present in the path, the service name closest to the filename is selected.

These mechanisms are tried in order; as soon as one succeeds, no further mapping attempts are made.

Datadog automatically associates the team attached to a service when a violation or vulnerability is detected. For example, if the file domains/ecommerce/apps/myservice/foo.py is associated with myservice, the team that owns myservice is also associated with any violation detected in this file.

If no services or teams are found, Datadog uses the CODEOWNERS file in your repository. The CODEOWNERS file determines which team owns a file in your Git provider.

Note: You must accurately map your Git provider teams to your Datadog teams for this feature to function properly.
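
As an illustration, a CODEOWNERS entry like the sketch below (the path and team name are hypothetical) assigns files under a directory to a Git provider team, which you then map to the corresponding Datadog team:

CODEOWNERS

# Hypothetical example: files under this path are owned by the ecommerce team
/domains/ecommerce/apps/myservice/ @myorganization/ecommerce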
