Compatibility
Supported GitLab versions:
GitLab.com (SaaS)
GitLab >= 14.1 (self-hosted)
GitLab >= 13.7.0 (self-hosted) by enabling the datadog_ci_integration feature flag
Partial pipelines: View partially retried and downstream pipeline executions
Manual steps: View manually triggered pipelines
Queue time: View the amount of time pipeline jobs wait in the queue before processing
Infrastructure metric correlation: Correlate pipelines to infrastructure host metrics for self-hosted GitLab runners
Custom pre-defined tags: Configure custom tags on all generated pipeline, stage, and job spans
Custom tags and metrics at runtime: Configure custom tags and metrics at runtime
Parameters: Set custom env or service parameters
Pipeline failure reasons: Identify pipeline failure reasons from error messages
Configure the Datadog integration
Configure the integration on a project or group by going to Settings > Integrations > Datadog for each project or group you want to instrument.
You can also activate the integration at the GitLab instance level, by going to Admin > Settings > Integrations > Datadog.
For self-hosted GitLab versions between 13.7.0 and 14.0, enable the datadog_ci_integration feature flag to activate the integration. Run a command that uses GitLab's Rails Runner; the exact invocation depends on your installation type.
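A minimal sketch of enabling the flag on an Omnibus installation with GitLab's Rails Runner (assuming shell access with sudo; installations from source or on Kubernetes reach the Rails Runner through a different entry point):

```shell
# Enable the Datadog CI integration feature flag (Omnibus GitLab installation)
sudo gitlab-rails runner "Feature.enable(:datadog_ci_integration)"
```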
Then, configure the integration on a project by going to Settings > Integrations > Datadog for each project you want to instrument.
Note: Due to a bug in early versions of GitLab, the Datadog integration cannot be enabled at the group or instance level on GitLab versions < 14.1, even if the option is available in GitLab's UI.
For older versions of GitLab, you can use webhooks to send pipeline data to Datadog.
Note: Webhook-based support is not under active development, and unexpected issues can occur. Datadog recommends that you update GitLab instead.
Go to Settings > Webhooks in your repository (or GitLab instance settings), and add a new webhook:
URL: https://webhook-intake.<DATADOG_SITE>/api/v2/webhook/?dd-api-key=<API_KEY>, where <DATADOG_SITE> is your Datadog site (for example, datadoghq.com) and <API_KEY> is your Datadog API key.
Secret Token: leave blank
Trigger: Select Job events and Pipeline events.
To set custom env or service parameters, add more query parameters in the webhooks URL: &env=<YOUR_ENV>&service=<YOUR_SERVICE_NAME>
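For example, a complete webhook URL with both parameters set could look like the following (placeholder site, API key, env, and service values):

```text
https://webhook-intake.<DATADOG_SITE>/api/v2/webhook/?dd-api-key=<API_KEY>&env=staging&service=gitlab-ci-eu
```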
Set custom tags
To set custom tags on all the pipeline and job spans generated by the integration, append a URL-encoded tags query parameter to the webhook URL, with key:value pairs separated by commas. If a key:value pair contains a comma, surround it with quotes. For example, to add key1:value1,"key2: value with , comma",key3:value3, append the following string to the webhook URL:
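Using standard percent-encoding (: as %3A, , as %2C, " as %22, and spaces as %20), the appended fragment looks like this:

```text
&tags=key1%3Avalue1%2C%22key2%3A%20value%20with%20%2C%20comma%22%2Ckey3%3Avalue3
```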
Datadog site
Specifies which Datadog site to send data to. Default: datadoghq.com
API URL (optional)
Allows overriding the API URL used for sending data directly; only used in advanced scenarios. Default: (empty, no override)
API key
Specifies which API key to use when sending data. You can generate one in the APIs tab of the Integrations section on Datadog.
Service (optional)
Specifies which service name to attach to each span generated by the integration. Use this to differentiate between GitLab instances. Default: gitlab-ci
Env (optional)
Specifies which environment (env tag) to attach to each span generated by the integration. Use this to differentiate between groups of GitLab instances (for example: staging or production). Default: none
Tags (optional)
Specifies any custom tags to attach to each span generated by the integration. Provide one tag per line in the format key:value. Default: (empty, no additional tags) Note: Available only on GitLab.com and GitLab >= 14.8 (self-hosted).
You can test the integration with the Test settings button (only available when configuring the integration on a project). After the test is successful, click Save changes to finish setting up the integration.
Integrate with Datadog Teams
To display and filter the teams associated with your pipelines, add team:<your-team> as a custom tag. The custom tag name must match your Datadog Teams team handle exactly.
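For example, if your team handle were backend-platform (a hypothetical handle), the Tags field described above could contain entries such as:

```text
team:backend-platform
region:eu-west-1
```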
Visualize pipeline data in Datadog
After the integration is successfully configured, the Pipelines and Pipeline Executions pages populate with data after the pipelines finish.
Note: The Pipelines page shows data for only the default branch of each repository.
Partial and downstream pipelines
In the Pipeline Executions page, you can use the filters below in the search bar:
Downstream Pipeline
Possible values: true, false
Manually Triggered
Possible values: true, false
Partial Pipeline
Possible values: retry, paused, resumed
These filters can also be applied through the facet panel on the left-hand side of the page.
Correlate infrastructure metrics to jobs
If you are using self-hosted GitLab runners, you can correlate jobs with the infrastructure that is running them.
For this feature to work, the GitLab runner must have a tag of the form host:<hostname>. Tags can be added while registering a new runner. For existing runners, add tags by updating the runner's config.toml, or add them through the UI by going to Settings > CI/CD > Runners and editing the appropriate runner.
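As a sketch, a new self-hosted runner could be registered with the required host tag as follows (hypothetical GitLab URL and registration token; adjust the executor to match your setup):

```shell
# Register a runner and tag it with the machine's hostname so CI Visibility
# can correlate its jobs with infrastructure metrics from this host
sudo gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.example.com/" \
  --registration-token "<REGISTRATION_TOKEN>" \
  --executor "shell" \
  --tag-list "host:$(hostname)"
```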
After these steps, CI Visibility adds the hostname to each job. To see the metrics, click on a job span in the trace
view. In the drawer, a new tab named Infrastructure appears which contains the host metrics.
Note: Infrastructure metrics are not supported with autoscaler GitLab runner executors.
View error messages for pipeline failures
Error messages are supported for GitLab versions 15.2.0 and above.
For failed GitLab pipeline executions, each error under the Errors tab within a specific pipeline execution displays a message associated with the error type from GitLab.
See the table below for the message and domain correlated with each error type. Any unlisted error type results in the error message Job failed and the error domain unknown.
| Error Type | Error Message | Error Domain |
|---|---|---|
| unknown_failure | Failed due to unknown reason | unknown |
| config_error | Failed due to error on CI/CD configuration file | user |
| external_validation_failure | Failed due to external pipeline validation | unknown |
| user_not_verified | The pipeline failed due to the user not being verified | user |
| activity_limit_exceeded | The pipeline activity limit was exceeded | provider |
| size_limit_exceeded | The pipeline size limit was exceeded | provider |
| job_activity_limit_exceeded | The pipeline job activity limit was exceeded | provider |
| deployments_limit_exceeded | The pipeline deployments limit was exceeded | provider |
| project_deleted | The project associated with this pipeline was deleted | provider |
| api_failure | API Failure | provider |
| stuck_or_timeout_failure | Pipeline is stuck or timed out | unknown |
| runner_system_failure | Failed due to runner system failure | provider |
| missing_dependency_failure | Failed due to missing dependency | unknown |
| runner_unsupported | Failed due to unsupported runner | provider |
| stale_schedule | Failed due to stale schedule | provider |
| job_execution_timeout | Failed due to job timeout | unknown |
| archived_failure | Archived failure | provider |
| unmet_prerequisites | Failed due to unmet prerequisite | unknown |
| scheduler_failure | Failed due to schedule failure | provider |
| data_integrity_failure | Failed due to data integrity | provider |
| forward_deployment_failure | Deployment failure | unknown |
| user_blocked | Blocked by user | user |
| ci_quota_exceeded | CI quota exceeded | provider |
| pipeline_loop_detected | Pipeline loop detected | user |
| builds_disabled | Build disabled | user |
| deployment_rejected | Deployment rejected | user |
| protected_environment_failure | Environment failure | provider |
| secrets_provider_not_found | Secret provider not found | user |
| reached_max_descendant_pipelines_depth | Reached max descendant pipelines | user |
| ip_restriction_failure | IP restriction failure | provider |
Enable job log collection
The following GitLab versions support collecting job logs:
GitLab >= 14.8 (self-hosted) by enabling the datadog_integration_logs_collection feature flag
Note: Logs are billed separately from CI Visibility.
Job logs are collected in Log Management and are automatically correlated with the GitLab pipeline in CI Visibility. Log files larger than one GiB are truncated.
For more information about processing job logs collected from the GitLab integration, see the Processors documentation.
To enable collection of job logs:

Click the Enable job logs collection checkbox in the GitLab integration under Settings > Integrations > Datadog.
Click Save changes.

Note: Datadog downloads log files directly from your GitLab logs object storage with temporary pre-signed URLs. The storage must not have network restrictions, such as an IP range allowlist.

On self-hosted GitLab versions that require the datadog_integration_logs_collection feature flag, enable the flag in your GitLab instance first so that the Enable job logs collection checkbox appears under Settings > Integrations > Datadog, then check it and click Save changes.
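As with the CI integration flag, a minimal sketch for an Omnibus installation using GitLab's Rails Runner (adapt the invocation to your installation type):

```shell
# Enable job log collection support for the Datadog integration (Omnibus GitLab installation)
sudo gitlab-rails runner "Feature.enable(:datadog_integration_logs_collection)"
```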