In Observability Pipelines, a pipeline is a sequential path with three types of components: source, processors, and destinations. The Observability Pipelines source receives logs from your log source (for example, the Datadog Agent). The processors enrich and transform your data, and the destination is where your processed logs are sent. For some templates, logs are sent to more than one destination; for example, if you use the Archive Logs template, your logs are sent to both a cloud storage provider and another specified destination.
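For example, if the Datadog Agent is your log source, the Agent needs to be configured to forward its logs to the Worker. The snippet below is a minimal sketch of the relevant `datadog.yaml` settings; the `<OPW_HOST>` address is a placeholder, and the exact keys depend on your Agent version, so follow the source instructions shown in the UI for your pipeline.

```yaml
# datadog.yaml (sketch): route Agent logs to an Observability Pipelines Worker.
# <OPW_HOST> is a placeholder for the host where your Worker is listening.
logs_enabled: true
observability_pipelines_worker:
  logs:
    enabled: true
    url: "http://<OPW_HOST>:8282"
```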
Set up your pipeline and its sources, processors, and destinations in the Observability Pipelines UI. The general setup steps are:
1. Navigate to Observability Pipelines and select a template.
2. Select and set up your source.
3. Select and set up your destinations.
4. Set up your processors.
5. Install the Observability Pipelines Worker.
See Advanced Configurations for bootstrapping options and for details on setting up the Worker with Kubernetes.
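As an illustration of what a Kubernetes setup can look like, the Worker is typically installed with Helm. The chart name, release name, and values keys below are assumptions for a sketch; treat the bootstrapping options in Advanced Configurations as authoritative.

```yaml
# values.yaml (sketch) for a Helm-based Worker install, assuming the
# datadog/observability-pipelines-worker chart. Install with, for example:
#   helm upgrade --install opw -f values.yaml datadog/observability-pipelines-worker
datadog:
  apiKey: "<DATADOG_API_KEY>"   # API key the Worker uses to pull its pipeline configuration
  pipelineId: "<PIPELINE_ID>"   # ID of the pipeline you built in the UI
  site: "datadoghq.com"         # your Datadog site
```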
After you have set up your pipeline, see Update Existing Pipelines if you want to make any changes to it.
Note: You cannot delete an active pipeline. You must stop all Workers for a pipeline before you can delete it.