",t};e.buildCustomizationMenuUi=t;function n(e){let t='
",t}function s(e){let n=e.filter.currentValue||e.filter.defaultValue,t='${e.filter.label}
`,e.filter.options.forEach(s=>{let o=s.id===n;t+=``}),t+="${e.filter.label}
Use the Azure Storage destination to send logs to an Azure Storage bucket. If you want to send logs in Datadog-rehydratable format to Azure Storage for archiving and rehydration, you must configure Log Archives. If you want to send your logs directly to Azure Storage, without converting them to Datadog-rehydratable format, skip to Set up the destination for your pipeline.
This step is only required if you want to send logs to Azure Storage in Datadog-rehydratable format for archiving and rehydration, and you don’t already have a Datadog Log Archive configured for Observability Pipelines. If you already have a Datadog Log Archive configured or only want to send your logs directly to Azure Storage, skip to Set up the destination for your pipeline.
You need to have Datadog’s Azure integration installed to set up Datadog Log Archives.
Create an Azure storage account if you don’t already have one.
Note: Do not set immutability policies because the most recent data might need to be rewritten in rare cases (typically when there is a timeout).
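If you prefer to script this step rather than use the Azure portal, the sketch below creates the archive container with the `azure-storage-blob` Python SDK. This is an assumption about your workflow, and the connection string and container name are placeholders, not values from this guide.

```python
# Hypothetical sketch: create the archive container with the azure-storage-blob SDK
# instead of the Azure portal. The connection string and container name are placeholders.
from azure.core.exceptions import ResourceExistsError
from azure.storage.blob import BlobServiceClient

connection_string = "<YOUR_AZURE_STORAGE_CONNECTION_STRING>"  # placeholder
container_name = "datadog-log-archive"                        # hypothetical name

service = BlobServiceClient.from_connection_string(connection_string)

try:
    # Per the note above, do not attach immutability policies to this container:
    # recent data may need to be rewritten in rare cases (typically after a timeout).
    service.create_container(container_name)
    print(f"Created container {container_name}")
except ResourceExistsError:
    print(f"Container {container_name} already exists")
```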
When you create the archive, add the query `observability_pipelines_read_only_archive`, assuming no logs going through the pipeline have that tag added. See the Log Archives documentation for additional information.
Set up the Azure Storage destination and its environment variables when you set up an Archive Logs pipeline. The information below is configured in the pipelines UI.
If you enter a prefix to store objects under a particular directory, it must end in `/` to act as a directory path; a trailing `/` is not automatically added.

Enter the Azure connection string you created earlier. The connection string gives the Worker access to your Azure Storage bucket.
To get the connection string, copy it from the storage account's Access keys page in the Azure portal.
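As an optional sanity check (not part of the official setup flow), you can confirm the connection string reaches the storage account and can see the target container before handing it to the Worker. The container name below is a placeholder.

```python
# Hedged sanity check: verify the connection string gives access to the storage
# account and the target container. The container name is a placeholder.
from azure.storage.blob import BlobServiceClient

connection_string = "<YOUR_AZURE_STORAGE_CONNECTION_STRING>"  # placeholder
container_name = "datadog-log-archive"                        # hypothetical name

service = BlobServiceClient.from_connection_string(connection_string)
container = service.get_container_client(container_name)

print("Account:", service.account_name)
print("Container exists:", container.exists())
```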
A batch of events is flushed when one of these parameters is met. See event batching for more information.
| Max Events | Max Bytes | Timeout (seconds) |
|---|---|---|
| None | 100,000,000 | 900 |
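For intuition, here is a minimal, illustrative sketch of the flush rule described above. It is not the Worker's actual implementation, and the names are hypothetical: a batch is flushed as soon as any configured limit is reached, with no event-count limit for this destination.

```python
# Illustrative sketch only (not the Worker's internal code): a batch is flushed as
# soon as ANY configured limit is reached. For this destination there is no
# event-count limit, the byte limit is 100,000,000, and the timeout is 900 seconds.
import time

MAX_EVENTS = None          # "None" in the table above: no event-count limit
MAX_BYTES = 100_000_000
TIMEOUT_SECONDS = 900

class Batch:
    def __init__(self):
        self.events: list[bytes] = []
        self.size_bytes = 0
        self.started = time.monotonic()

    def add(self, event: bytes) -> None:
        self.events.append(event)
        self.size_bytes += len(event)

    def should_flush(self) -> bool:
        return (
            (MAX_EVENTS is not None and len(self.events) >= MAX_EVENTS)
            or self.size_bytes >= MAX_BYTES
            or time.monotonic() - self.started >= TIMEOUT_SECONDS
        )
```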