Use this guide to set up logging from your Azure subscriptions to Datadog.
Datadog recommends sending logs from Azure to Datadog with the Agent or DaemonSet. For some resources, this may not be possible. In these cases, you can create a log forwarding pipeline using an Azure Event Hub to collect Azure platform logs. For resources that cannot stream Azure platform logs to an Event Hub, you can use the Blob Storage forwarding option.
All sites: All Datadog sites can use the steps on this page to send Azure logs to Datadog.
US3: If your organization is on the Datadog US3 site, you can use the Azure Native integration to simplify configuration for your Azure log forwarding. Datadog recommends using this method when possible. Configuration is done through the Datadog resource in Azure. This replaces the Azure Event Hub process for log forwarding. See the Azure Native Logging Guide for more information.
To get started, click the button below and fill in the form on Azure Portal. The Azure resources required to get activity logs streaming into your Datadog account will be deployed for you.
Alternatively, Datadog provides automated scripts you can use for sending Azure activity logs and Azure platform logs (including resource logs).
Follow these steps to run the script that creates and configures the Azure resources required to stream activity logs into your Datadog account. These resources include activity log diagnostic settings, Azure Functions, Event Hub namespaces, and Event Hubs.
1. In the Azure portal, navigate to your Cloud Shell.

2. Run the PowerShell command below to download the automation script into your Cloud Shell environment. You can also view the contents of the script.

   ```powershell
   (New-Object System.Net.WebClient).DownloadFile("https://raw.githubusercontent.com/DataDog/datadog-serverless-functions/master/azure/eventhub_log_forwarder/activity_logs_deploy.ps1", "activity_logs_deploy.ps1")
   ```

3. Run the command below, replacing `<API_KEY>` with your Datadog API key and `<SUBSCRIPTION_ID>` with your Azure subscription ID. You can also add optional parameters to configure your deployment (see Optional Parameters).

   ```powershell
   ./activity_logs_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID>
   ```
To send Azure platform logs (including resource logs), you can deploy an Event Hub and log forwarder function pair. After deploying, create diagnostic settings for each of the log sources to stream logs to Datadog.
1. In the Azure portal, navigate to your Cloud Shell.

2. Run the PowerShell command below to download the automation script into your Cloud Shell environment. You can also view the contents of the script.

   ```powershell
   (New-Object System.Net.WebClient).DownloadFile("https://raw.githubusercontent.com/DataDog/datadog-serverless-functions/master/azure/eventhub_log_forwarder/resource_deploy.ps1", "resource_deploy.ps1")
   ```

3. Run the command below, replacing `<API_KEY>` with your Datadog API key and `<SUBSCRIPTION_ID>` with your Azure subscription ID. You can also add other optional parameters to configure your deployment (see Optional Parameters).

   ```powershell
   ./resource_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID>
   ```
All of the Azure resources deployed for the platform logs pipeline have the resource group location appended to their default names, for example `datadog-eventhub-westus`. You can alter this convention by overriding the relevant naming parameter.
Note: Resources can only stream to Event Hubs in the same Azure region, so you need to replicate step 2 for each region you want to stream resource logs from.
To stream both activity logs and resource logs, run the first script with the optional parameter `-ResourceGroupLocation <REGION>`. Activity logs are a subscription-level source, so you can create your pipeline for them in any region. Once this is deployed, send resource logs through the same Event Hub by adding diagnostic settings on your resources in that region.
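As a sketch, streaming both log types while covering two regions (the region names here are illustrative) might look like:

```powershell
# Activity logs are subscription-level: one pipeline in any region is enough.
./activity_logs_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID> -ResourceGroupLocation westus

# Resource logs can only stream to an Event Hub in the same region,
# so deploy the platform logs pipeline once per region you collect from.
./resource_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID> -ResourceGroupLocation westus
./resource_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID> -ResourceGroupLocation eastus
```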
Note: This integration does not collect events.
Note: Ensure that your custom resource names are unique when you customize the following parameters. Validate that the resource name does not already exist within your list of other Azure resources.
| Flag `<Default Parameter>` | Description |
|---|---|
| `-DatadogSite <datadoghq.com>` | Customize your Datadog instance by adding this flag with another Datadog site as a parameter. |
| `-Environment <AzureCloud>` | Manage storage in Azure independent clouds by adding this flag as a parameter. Additional options are `AzureChinaCloud`, `AzureGermanCloud`, and `AzureUSGovernment`. |
| `-ResourceGroupLocation <westus2>` | Choose the region in which your Azure resource group and resources are deployed by adding this flag with an updated Azure region. |
| `-ResourceGroupName <datadog-log-forwarder-rg>` | Customize the name of your Azure resource group by adding this flag with an updated parameter. |
| `-EventhubNamespace <datadog-ns-4c6c53b4-1abd-4798-987a-c8e671a5c25e>` | Customize your Azure Event Hub namespace by adding this flag with an updated parameter. By default, `datadog-ns-<globally-unique-ID>` is generated. |
| `-EventhubName <datadog-eventhub>` | Customize the name of your Azure Event Hub by adding this flag with an updated parameter. |
| `-FunctionAppName <datadog-functionapp-1435ad2f-7c1f-470c-a4df-bc7289d8b249>` | Customize the name of your Azure function app by adding this flag with an updated parameter. By default, `datadog-functionapp-<globally-unique-ID>` is generated. |
| `-FunctionName <datadog-function>` | Customize the name of your Azure Function by adding this flag with an updated parameter. |
| `-DiagnosticSettingName <datadog-activity-logs-diagnostic-setting>` | Customize the name of your Azure diagnostic setting by adding this flag with an updated parameter. (Only relevant for sending activity logs.) |
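For example, a hedged sketch combining several of these flags (all values here are illustrative, and custom names must be unique in your subscription) might look like:

```powershell
./resource_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID> `
    -DatadogSite datadoghq.eu `
    -ResourceGroupName my-datadog-log-forwarder-rg `
    -EventhubName my-datadog-eventhub `
    -FunctionAppName my-datadog-functionapp-01
```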
Installation errors? See [Automated log collection][1] for common error cases.
This section describes the manual setup process to forward your Azure logs to Datadog:
The instructions below walk through a basic, initial setup using the Azure portal. All of these steps can also be performed with the CLI, PowerShell, or resource templates; refer to the Azure documentation.
If you already have an Event Hubs namespace configured with an Event Hub connection string, skip to Add an Event Hub to your Event Hubs namespace.
See the Azure Event Hubs Quickstart for additional information.
If you already have a function app configured with an Event Hub connection string, skip to Add a new function to your Function App using the Event Hub trigger template.
For Runtime stack, select Node.js, and for Version, select 18 LTS. See Azure Event Hubs trigger for Azure Functions for more information.
Note: If you don’t want to paste your Datadog API key value directly into the function’s code, create an additional environment variable for the Datadog API key value.
See Getting started with Azure functions for more information.
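As a minimal sketch of that approach (the helper name is hypothetical; `DD_API_KEY` and the `<DATADOG_API_KEY>` placeholder come from the steps below), the function code could read the key from the environment and fail fast when it is missing:

```javascript
// Sketch: read the Datadog API key from an environment variable instead of
// hard-coding it in the function source. The helper name is illustrative.
function getDatadogApiKey(env = process.env) {
  const key = env.DD_API_KEY;
  if (!key || key === "<DATADOG_API_KEY>") {
    throw new Error("DD_API_KEY environment variable is not set");
  }
  return key;
}
```

Failing fast here means a misconfigured function surfaces an error immediately instead of sending unauthenticated requests to the Datadog intake.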
- Copy the log forwarder code and paste it into your function's `index.js` file.
- Add your Datadog API key with a `DD_API_KEY` environment variable, or copy it into the function code by replacing `<DATADOG_API_KEY>` on line 21.
- Set your Datadog site with a `DD_SITE` environment variable under the configuration tab of your function app, or copy the site parameter into the function code on line 22.
- Make sure the Event Hub parameter name is `eventHubMessages` and that Cardinality is set to `Many`.
- You can use the default `RootManageSharedAccessKey` policy if desired. Optionally, create your own shared access policy at the Event Hub namespace level.

See Diagnostic settings in Azure Monitor for more information.
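For reference, the trigger settings above live in the function's binding configuration. A minimal `function.json` sketch, assuming the defaults from the Event Hub trigger template (the Event Hub name and connection setting name are placeholders), could look like:

```json
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "direction": "in",
      "name": "eventHubMessages",
      "eventHubName": "<EVENT_HUB_NAME>",
      "connection": "<CONNECTION_SETTING_NAME>",
      "cardinality": "many",
      "consumerGroup": "$Default"
    }
  ]
}
```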
Configure your Azure resources to forward their logs to the Event Hub with a diagnostic setting.
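If you prefer scripting this over the portal, a diagnostic setting that streams a resource's logs to an Event Hub can also be created with the Azure CLI. A hedged sketch (the resource ID, log category, and Event Hub names are placeholders):

```shell
az monitor diagnostic-settings create \
  --name datadog-diagnostic-setting \
  --resource <RESOURCE_ID> \
  --event-hub <EVENT_HUB_NAME> \
  --event-hub-rule <EVENT_HUB_NAMESPACE_AUTH_RULE_ID> \
  --logs '[{"category": "<LOG_CATEGORY>", "enabled": true}]'
```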
Datadog recommends using the Event Hub setup for Azure log collection. However, you can also follow the steps in this section to forward all of your Azure App Services logs from Azure Blob Storage:
If you already have a function app configured for this purpose, skip to Add a new function to your Function App using the Event Hub trigger template.
For Runtime stack, select Node.js, for Version select 18 LTS, and for Operating System select Windows. See Getting started with Azure Functions for more information.
- Copy the log forwarder code and paste it into your function's `index.js` file.
- Add your Datadog API key with a `DD_API_KEY` environment variable, or copy it into the function code by replacing `<DATADOG_API_KEY>` on line 20.
- Set your Datadog site with a `DD_SITE` environment variable under the configuration tab of your function app, or copy the site parameter into the function code on line 21.
- Make sure the blob parameter name is `blobContent`, and click Save.

Archiving logs to Azure Blob Storage requires an App Registration, even if you are using the Azure Native integration. To archive logs to Azure Blob Storage, follow the setup instructions to configure the integration using an App Registration. App Registrations created for archiving purposes do not need the `Monitoring Reader` role assigned.
Once you have an App Registration configured, you can create a log archive that writes to Azure Blob Storage.
Note: If your storage bucket is in a subscription being monitored through the Azure Native integration, a warning is displayed in the Azure Integration Tile about the App Registration being redundant. You can ignore this warning.