This document walks through how to send Azure Event Hubs logs to Observability Pipelines using the Kafka source. The setup steps include setting up Azure Event Hubs for the Kafka source:

- Create an Event Hubs namespace.
- Create an Event Hub to act as the Kafka topic.
- Create a shared access policy and collect the Kafka connection values.
After Azure Event Hubs has been set up, you set up a pipeline with the Kafka source to send Azure Event Hubs logs to Observability Pipelines.
To set up Azure Event Hubs (an Azure CLI sketch of these steps follows the list):

1. Create an Event Hubs namespace in the Azure portal and select a region (for example, `westus`). Note: The Kafka endpoint is automatically enabled for standard and higher tiers.
2. In the namespace, create an Event Hub (for example, `datadog-topic`) and configure its settings (for example, 4 partitions and a 7-day retention time).
3. Create a shared access policy (for example, `DatadogKafkaPolicy`) for the Event Hub (`datadog-topic`).
4. Azure Event Hubs exposes a Kafka endpoint at `NAMESPACE.servicebus.windows.net:9093`, which Observability Pipelines uses as the Kafka source.
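The steps above use the Azure portal. As a minimal sketch of the same setup with the Azure CLI (assuming the CLI is installed, `<RESOURCE_GROUP>` is an existing resource group, and a Listen-only policy is sufficient for the Worker; flag names can vary across CLI versions):

```sh
# Create the namespace (Standard tier or higher enables the Kafka endpoint)
az eventhubs namespace create \
  --resource-group <RESOURCE_GROUP> \
  --name myeventhubns \
  --location westus \
  --sku Standard

# Create the Event Hub that acts as the Kafka topic
az eventhubs eventhub create \
  --resource-group <RESOURCE_GROUP> \
  --namespace-name myeventhubns \
  --name datadog-topic \
  --partition-count 4 \
  --message-retention 7

# Create a shared access policy for the Event Hub
# (Listen is assumed here; grant what your setup requires)
az eventhubs eventhub authorization-rule create \
  --resource-group <RESOURCE_GROUP> \
  --namespace-name myeventhubns \
  --eventhub-name datadog-topic \
  --name DatadogKafkaPolicy \
  --rights Listen

# Print the connection string used as the SASL password below
az eventhubs eventhub authorization-rule keys list \
  --resource-group <RESOURCE_GROUP> \
  --namespace-name myeventhubns \
  --eventhub-name datadog-topic \
  --name DatadogKafkaPolicy \
  --query primaryConnectionString --output tsv
```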
To collect the connection values (a connectivity check follows the list):

1. Find your Event Hubs namespace name (for example, `myeventhubns`).
2. The Kafka bootstrap server is `<NAMESPACE>.servicebus.windows.net` (for example, `myeventhubns.servicebus.windows.net`). Append `:9093` to form the Bootstrap Servers value: `<NAMESPACE>.servicebus.windows.net:9093`. For example, if your namespace is `myeventhubns`, the Bootstrap Servers value is `myeventhubns.servicebus.windows.net:9093`.
3. Use these SASL credentials (the platform-specific install steps below show how to escape the `$` in this value):
   - Username: the literal string `$ConnectionString`.
   - Password: `Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=<PolicyName>;SharedAccessKey=<Key>`.
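Before wiring up the Worker, you can sanity-check these values against the Kafka endpoint. A sketch, assuming `kcat` is installed and using the example names from above (`-L` lists broker and topic metadata):

```sh
kcat -L \
  -b myeventhubns.servicebus.windows.net:9093 \
  -X security.protocol=SASL_SSL \
  -X sasl.mechanisms=PLAIN \
  -X sasl.username='$ConnectionString' \
  -X sasl.password='Endpoint=sb://myeventhubns.servicebus.windows.net/;SharedAccessKeyName=DatadogKafkaPolicy;SharedAccessKey=<Key>'
```

If the credentials are correct, the output lists `datadog-topic` and its partitions.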
Next, set up Observability Pipelines for your platform (Kubernetes or Linux VM below).

To set up the pipeline on Kubernetes:

1. Navigate to Observability Pipelines.
2. Select the Kafka source and configure it:
   - Enter the group ID (for example, `datadog-consumer-group`).
   - In the Topics field, enter `datadog-topic`, or the topic you configured for your Event Hub earlier.
   - Enable SASL and TLS. For TLS on Kubernetes, update your `values.yaml` file to use the CA certificate that ships as part of the container image:

```yaml
initContainers:
- name: copy-config
  image: gcr.io/datadoghq/observability-pipelines-worker:latest
  imagePullPolicy: IfNotPresent
  # Copy the image's CA bundle into the shared config volume at startup
  command: ['/bin/sh', '-c', 'mkdir -p /config-volume/observability-pipelines-worker/config/ && cp /etc/ssl/certs/ca-certificates.crt /config-volume/observability-pipelines-worker/config/ca-certificates.crt']
  volumeMounts:
  - name: config-volume
    mountPath: /config-volume
extraVolumes:
- name: config-volume
  emptyDir: {}
extraVolumeMounts:
- name: config-volume
  mountPath: /config-volume
```
   When you install the Worker, point `DD_OP_DATA_DIR` at the mounted volume by adding this flag to your Helm command (see the Helm sketch after this list):

```sh
--set env[0].name=DD_OP_DATA_DIR,env[0].value='/config-volume/observability-pipelines-worker/'
```

   - Enter `/ca-certificates.crt` if you used the example above. Otherwise, enter the name of your certificate.
3. Click Next: Select Destination.
4. After you set up your destinations and processors, click Next: Install.
5. Select your platform in the Choose your installation platform dropdown menu.
6. Enter the environment variables for your Kafka source:
   - Bootstrap Servers: `<NAMESPACE>.servicebus.windows.net:9093` (for example, `myeventhubns.servicebus.windows.net:9093`).
   - SASL Username: `$$$$ConnectionString`. Note: You must have `$$$$` in front of `ConnectionString` because `$$$$` ends up as `$$` when transposed into the environment.
   - SASL Password: `Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=<PolicyName>;SharedAccessKey=<Key>`.
7. Enter the environment variables for your destinations, if applicable.
8. Follow the rest of the instructions on the page to install the Worker based on your platform.
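The exact install command is generated on the Install page. As a sketch of where the `DD_OP_DATA_DIR` flag from the source setup fits into a Helm install (the release name `opw` is a placeholder; take the base command and any other values from the in-app instructions):

```sh
helm repo add datadog https://helm.datadoghq.com
helm repo update

# Base command comes from the in-app Install page; append the data-dir override
helm upgrade --install opw datadog/observability-pipelines-worker \
  -f values.yaml \
  --set env[0].name=DD_OP_DATA_DIR,env[0].value='/config-volume/observability-pipelines-worker/'
```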
To set up the pipeline on a Linux VM:

1. Navigate to Observability Pipelines.
2. Select the Kafka source and configure it:
   - Enter the group ID (for example, `datadog-consumer-group`).
   - Enter `datadog-topic` in the Topics field.
   - Enable SASL and TLS. For TLS, copy the system CA certificate into the Worker's configuration directory:

```sh
sudo mkdir -p /var/lib/observability-pipelines-worker/config
sudo cp /etc/ssl/certs/ca-certificates.crt /var/lib/observability-pipelines-worker/config/
```

   - Enter `/ca-certificates.crt`.
3. Click Next: Select Destination.
4. After you set up your destinations and processors, click Next: Install.
5. Select your platform in the Choose your installation platform dropdown menu.
6. Enter the environment variables for your Kafka source (a sketch of the resulting install command follows this list):
   - Bootstrap Servers: `<NAMESPACE>.servicebus.windows.net:9093` (for example, `myeventhubns.servicebus.windows.net:9093`).
   - SASL Username: `\$\$ConnectionString`. Note: You must escape the `$` characters in front of `ConnectionString`, otherwise the environment variable won't be loaded.
   - SASL Password, in quotes (`"`). For example, `"Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=<PolicyName>;SharedAccessKey=<Key>"`.
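As a hedged illustration of how these values look when passed to the Worker install script (illustration only: the actual one-line install command is generated on the Install page, and the trailing command below is a placeholder):

```sh
# Escaped username (\$\$ becomes the literal $$ConnectionString in the env file)
# and a double-quoted password so the semicolons survive the shell
DD_OP_SOURCE_KAFKA_BOOTSTRAP_SERVERS=myeventhubns.servicebus.windows.net:9093 \
DD_OP_SOURCE_KAFKA_SASL_USERNAME=\$\$ConnectionString \
DD_OP_SOURCE_KAFKA_SASL_PASSWORD="Endpoint=sb://myeventhubns.servicebus.windows.net/;SharedAccessKeyName=DatadogKafkaPolicy;SharedAccessKey=<Key>" \
<install command from the Install page>
```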
If you run into issues after installing the Worker, check your Observability Pipelines environment file (`/etc/default/observability-pipelines-worker`) to make sure the environment variables are correctly set:

```sh
DD_OP_SOURCE_KAFKA_SASL_USERNAME="$$ConnectionString"
DD_OP_SOURCE_KAFKA_BOOTSTRAP_SERVERS=<NAMESPACE>.servicebus.windows.net:9093
DD_OP_SOURCE_KAFKA_SASL_PASSWORD="Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=<PolicyName>;SharedAccessKey=<Key>"
DD_OP_SOURCE_KAFKA_KEY_PASS=password
```
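After correcting the file, restart the Worker and tail its logs to verify the variables load cleanly. A quick check, assuming the default systemd unit name installed by the package:

```sh
sudo systemctl restart observability-pipelines-worker
sudo journalctl -u observability-pipelines-worker -n 50 --no-pager
```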
If you see the error `Missing environment variable DD_OP_SOURCE_KAFKA_SASL_PASSWORD` and you are running the Worker in a VM, make sure that the variable is in quotes (`"`) when you run the Worker install script. For example:

```sh
DD_OP_SOURCE_KAFKA_SASL_PASSWORD="Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=<PolicyName>;SharedAccessKey=<Key>"
```
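To confirm logs flow end to end once the Worker is running, you can produce a test record to the Event Hub and watch for it at your destination. A sketch, assuming `kcat` and the example names used throughout this guide (producing requires a policy with Send rights, so `<PolicyWithSendRights>` is a placeholder for such a policy, not the Listen-only policy above):

```sh
echo '{"message":"observability pipelines test","source":"azure-event-hubs"}' | \
  kcat -P \
    -b myeventhubns.servicebus.windows.net:9093 \
    -t datadog-topic \
    -X security.protocol=SASL_SSL \
    -X sasl.mechanisms=PLAIN \
    -X sasl.username='$ConnectionString' \
    -X sasl.password='Endpoint=sb://myeventhubns.servicebus.windows.net/;SharedAccessKeyName=<PolicyWithSendRights>;SharedAccessKey=<Key>'
```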