Overview
Scorecards help your team measure and continuously improve the health and performance of your software. As a Platform Engineer, you can create Scorecards that automatically evaluate entities in your Software Catalog against defined criteria and surface areas that need attention.
You have full control over how Scorecards are defined. In addition to the three core Scorecards that Datadog provides (Production Readiness, Observability Best Practices, and Documentation & Ownership), you can customize the default rules or create new ones that match your team's priorities and operational standards. This flexibility lets you tailor Scorecards to your organization's engineering culture and maturity.
Datadog evaluates the default Scorecards against a set of pass-fail criteria every 24 hours for all registered entities in the Software Catalog. You can turn off these default evaluations at any time. For custom rules, you can configure the data input, evaluation criteria, and evaluation cadence using the Scorecards API or Datadog Workflow Automation.
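For example, a custom rule can be created and evaluated on your own cadence by calling the Scorecards API from a script or a Workflow Automation step. The sketch below assumes the public `POST /api/v2/scorecard/rules` and `POST /api/v2/scorecard/outcomes/batch` endpoints with standard `DD-API-KEY` and `DD-APPLICATION-KEY` authentication; the rule name, entity reference, and payload details are illustrative only, so verify the exact schema against the Scorecards API reference.

```python
# Minimal sketch: create a custom Scorecard rule, then report a pass/fail
# outcome for it. Endpoint paths, payload shapes, and "state" values are
# assumptions; check the current Datadog API reference before relying on them.
import os
import requests

DD_SITE = os.environ.get("DD_SITE", "datadoghq.com")
HEADERS = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    "Content-Type": "application/json",
}

# 1. Create a custom rule attached to one of your Scorecards.
rule_payload = {
    "data": {
        "type": "rule",
        "attributes": {
            "name": "Has a runbook linked",  # hypothetical rule name
            "scorecard_name": "Documentation & Ownership",
        },
    }
}
rule_resp = requests.post(
    f"https://api.{DD_SITE}/api/v2/scorecard/rules",
    headers=HEADERS,
    json=rule_payload,
)
rule_resp.raise_for_status()
rule_id = rule_resp.json()["data"]["id"]

# 2. Report an outcome for that rule against a Software Catalog entity.
outcome_payload = {
    "data": {
        "type": "batched-outcome",
        "attributes": {
            "results": [
                {
                    "entity_reference": "service:shopping-cart",  # hypothetical entity
                    "rule_id": rule_id,
                    "state": "pass",
                    "remarks": "Runbook linked in the entity definition.",
                }
            ]
        },
    }
}
outcome_resp = requests.post(
    f"https://api.{DD_SITE}/api/v2/scorecard/outcomes/batch",
    headers=HEADERS,
    json=outcome_payload,
)
outcome_resp.raise_for_status()
```

A similar call can run inside a Workflow Automation step so the outcome is refreshed on whatever cadence your team chooses.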
Datadog can summarize Scorecard results into automated reports and deliver them directly through Slack, helping your team stay aligned, track improvements, and efficiently address gaps.
Get started
Set up Scorecards and explore how they can help your team:
Further reading