Use this guide to set up logging from your Azure subscriptions to Datadog.
Datadog recommends sending logs from Azure to Datadog with the Agent or DaemonSet. For some resources, this may not be possible. In these cases, you can create a log forwarding pipeline using an Azure Event Hub to collect Azure Platform Logs. For resources that cannot stream Azure Platform Logs to an Event Hub, you can use the Blob Storage forwarding option.
All sites: All Datadog sites can use the steps on this page to send Azure logs to Datadog.
US3: If your organization is on the Datadog US3 site, you can use the Azure Native integration to simplify configuration for your Azure log forwarding. Datadog recommends using this method when possible. Configuration is done through the Datadog resource in Azure. This replaces the Azure Event Hub process for log forwarding. See the Azure Native Logging Guide for more information.
To get started, click the button below and fill in the form on Azure Portal. The Azure resources required to get activity logs streaming into your Datadog account will be deployed for you.
Alternatively, Datadog provides automated scripts you can use for sending Azure activity logs and Azure platform logs (including resource logs).
Azure activity logs
Follow these steps to run the script that creates and configures the Azure resources required to stream activity logs into your Datadog account. These resources include activity log diagnostic settings, Azure Functions, Event Hub namespaces, and Event Hubs.
In the Azure portal, navigate to your Cloud Shell.
Run the command below to download the automation script into your Cloud Shell environment.
Invoke the script by running the command below, replacing <API_KEY> with your Datadog API key and <SUBSCRIPTION_ID> with your Azure subscription ID. You can also add optional parameters to configure your deployment (see Optional parameters).
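For reference, the download and invocation typically look like the sketch below. The script URL and the -ApiKey and -SubscriptionId parameter names are assumptions; confirm the current path and parameters in Datadog's datadog-serverless-functions repository.

# Download the activity logs deployment script into Cloud Shell
# (illustrative URL; confirm it in the DataDog/datadog-serverless-functions repository).
Invoke-WebRequest -Uri "https://raw.githubusercontent.com/DataDog/datadog-serverless-functions/master/azure/eventhub_log_forwarder/activity_logs_deploy.ps1" -OutFile activity_logs_deploy.ps1

# Invoke the script with your Datadog API key and Azure subscription ID (parameter names assumed).
./activity_logs_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID>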
Azure platform logs
To send Azure platform logs (including resource logs), you can deploy an Event Hub and log forwarder function pair.
After deploying, create diagnostic settings for each of the log sources to stream logs to Datadog.
In the Azure portal, navigate to your Cloud Shell.
Run the PowerShell command below to download the automation script into your Cloud Shell environment.
Invoke the script by running the PowerShell command below, replacing <API_KEY> with your Datadog API key and <SUBSCRIPTION_ID> with your Azure subscription ID. You can also add other optional parameters to configure your deployment (see Optional parameters).
Create diagnostic settings for all Azure resources sending logs to Datadog. Configure these diagnostic settings to stream to the Event Hub you just created.
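The download and invocation steps above typically look like the following sketch. As before, the script URL and the -ApiKey and -SubscriptionId parameter names are assumptions to confirm in Datadog's datadog-serverless-functions repository.

# Download the platform (resource) logs deployment script into Cloud Shell
# (illustrative URL; confirm it in the DataDog/datadog-serverless-functions repository).
Invoke-WebRequest -Uri "https://raw.githubusercontent.com/DataDog/datadog-serverless-functions/master/azure/eventhub_log_forwarder/resource_deploy.ps1" -OutFile resource_deploy.ps1

# Deploy the Event Hub and forwarder function pair (parameter names assumed).
./resource_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID>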
All of the Azure resources deployed for the platform logs pipeline have the resource group location appended to their default names, for example, datadog-eventhub-westus. You can alter this convention by overriding the corresponding parameter.
Note: Resources can only stream to Event Hubs in the same Azure region, so you need to replicate step 2 for each region you want to stream resource logs from.
Set up both activity and resource logs
To stream both activity logs and resource logs, run the first script, including the optional parameter -ResourceGroupLocation <REGION>. Activity logs are a subscription-level source, so you can create the pipeline for them in any region. Once this is deployed, send resource logs through the same Event Hub by adding diagnostic settings on your resources in <REGION>.
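As a sketch, assuming the same script path and -ApiKey/-SubscriptionId parameter names shown above, such a deployment might be invoked like this (westus2 is only an example region):

# Deploy the activity logs pipeline into a chosen region so that resources in the
# same region can stream to its Event Hub (script name and parameters assumed).
./activity_logs_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID> -ResourceGroupLocation westus2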
Note: This integration does not collect events.
Optional parameters
Note: Ensure that your custom resource names are unique when you customize the following parameters. Validate that the resource name does not already exist within your list of other Azure resources.
-Flag <Default Parameter>: Description
-DatadogSite <datadoghq.com>: Customize your Datadog instance by adding this flag with another Datadog site as a parameter.
-Environment <AzureCloud>: Manage storage in Azure independent clouds by adding this flag as a parameter. Additional options are AzureChinaCloud, AzureGermanCloud, and AzureUSGovernment.
-ResourceGroupLocation <westus2>: Choose the region in which your Azure resource group and resources are deployed by adding this flag with an updated Azure region.
-ResourceGroupName <datadog-log-forwarder-rg>: Customize the name of your Azure resource group by adding this flag with an updated parameter.
-FunctionAppName: Customize the name of your Azure function app by adding this flag with an updated parameter. By default, datadog-functionapp-<globally-unique-ID> is generated.
-FunctionName <datadog-function>: Customize the name of your Azure Function by adding this flag with an updated parameter.
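As an illustration only, a deployment combining several of these flags (assuming the activity logs script and required parameters shown earlier; all names here are hypothetical) might look like:

./activity_logs_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID> `
    -DatadogSite datadoghq.eu `
    -ResourceGroupLocation westus2 `
    -ResourceGroupName example-datadog-log-forwarder-rg `
    -FunctionName example-datadog-function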
Configure your Azure services to stream logs to the Event Hub by creating a diagnostic setting.
The instructions below walk through a basic, initial setup using the Azure Portal. All of these steps can also be performed with the CLI, PowerShell, or resource templates; refer to the Azure documentation for details.
Create a new namespace or add a new Event Hub to an existing namespace by following the instructions below.
In the Azure portal, navigate to the Event Hubs overview and click Create.
Enter the name, pricing tier, subscription, and resource group.
Select Location. Note: The Event Hub must be in the same Location as the resource you want to submit logs from. For activity logs or other account-wide log sources, you can choose any region.
Select your desired options for throughput units, availability zones, and auto-inflation.
Click Create.
Add an Event Hub to your Event Hub namespace.
In the Azure portal, navigate to a new or existing namespace.
Click + Event Hub.
Select your desired options for name, partition count, and message retention.
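These portal steps can also be scripted. The following is a minimal sketch using the Az.EventHub PowerShell module; the resource names are placeholders, and parameter names can differ between module versions, so verify them with Get-Help New-AzEventHubNamespace and Get-Help New-AzEventHub before running.

# Create an Event Hub namespace in the region of the resources you plan to collect logs from.
New-AzEventHubNamespace -ResourceGroupName datadog-log-forwarder-rg -Name datadog-ns -Location westus

# Create an Event Hub inside that namespace.
New-AzEventHub -ResourceGroupName datadog-log-forwarder-rg -NamespaceName datadog-ns -Name datadog-eventhub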
Add your API key by creating a DD_API_KEY environment variable under the configuration tab of your function app, or copy it into the function code by replacing <DATADOG_API_KEY> on line 22.
If you’re not using the Datadog US1 site, set your Datadog site with a DD_SITE environment variable under the configuration tab of your function app, or copy the site parameter into the function code on line 23.
Save the function.
Click on Integration, then Azure Event Hubs under trigger, and check the following settings:
a. Event Parameter Name is set to eventHubMessages.
b. Event Hub Cardinality is set to Many.
c. Event Hub Data Type is left empty.
Click Save.
Verify your setup is correct by running the function and then checking the Datadog log explorer for the test message. Note: The test log event must be in valid JSON format.
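If you prefer to set the DD_API_KEY and DD_SITE values from the command line rather than the configuration tab, a sketch using the Az.Functions module could look like the following. The resource group and function app names are placeholders; confirm the parameters with Get-Help Update-AzFunctionAppSetting.

# Add the Datadog API key (and, if needed, the Datadog site) as app settings on the forwarder function app.
Update-AzFunctionAppSetting -ResourceGroupName <RESOURCE_GROUP> -Name <FUNCTION_APP_NAME> -AppSetting @{
    DD_API_KEY = "<API_KEY>"
    DD_SITE    = "datadoghq.eu"   # only required if you are not using the US1 site
}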
Activity logs
In the Azure portal, navigate to the Activity Log.
Click on Diagnostic Settings.
Click Add diagnostic setting.
Under category details, select the categories of logs you want to send to Datadog.
Under destination details, select Stream to an event hub.
Set the Event Hub namespace and name. These should match the Event Hub namespace and name that you used to create your Event Hub trigger.
Set the shared access key. This key should be configured with send or manage access.
Click Save.
Verify your setup is correct by checking the Datadog log explorer for logs from this resource.
Resource logs
Configure your Azure services to forward their logs to the Event Hub by creating a diagnostic setting.
In the Azure portal, navigate to the resource of the logs you want to send to Datadog.
Under the monitoring section of the resource blade, click Diagnostic settings.
Click Add diagnostic setting.
Under category details, select the categories of logs you want to send to Datadog.
Under destination details, select Stream to an event hub.
Set the Event Hub namespace and name. These should match the Event Hub namespace and name that you used to create your Event Hub trigger.
Set the shared access key. This key should be configured with send or manage access.
Click Save.
Verify your setup is correct by checking the Datadog log explorer for logs from this resource.
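To script this diagnostic setting instead of using the portal, a rough sketch with the Az.Monitor module could look like the following. The resource IDs, names, and log category are placeholders, and cmdlet or parameter names may differ between Az.Monitor versions (older versions use Set-AzDiagnosticSetting), so check Get-Help New-AzDiagnosticSetting first.

# Build a log settings object for the category you want to forward (placeholder category).
$log = New-AzDiagnosticSettingLogSettingsObject -Enabled $true -Category AuditLogs

# Create the diagnostic setting that streams that category to the Datadog Event Hub.
New-AzDiagnosticSetting -Name datadog-diagnostic-setting `
    -ResourceId "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RG>/providers/<PROVIDER>/<RESOURCE>" `
    -EventHubName datadog-eventhub `
    -EventHubAuthorizationRuleId "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RG>/providers/Microsoft.EventHub/namespaces/<NAMESPACE>/authorizationRules/RootManageSharedAccessKey" `
    -Log $log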
Datadog recommends using the Event Hub setup for Azure log collection. However, you can also follow the steps below to forward all of your Azure App Services logs from Blob storage:
In the Azure portal, navigate to the Function Apps overview and click Create.
Select a subscription, resource group, and region, and enter a name for your function app.
Set Publish to Code, Runtime stack to Node.js, and Version to 18 LTS.
Select Windows as the operating system and choose a plan type.
Click Next: Hosting.
Select a storage account.
Review and Create the new function.
Once deployment has finished, select your new function from the function apps list.
Select to build your function In-portal and use the Blob Storage trigger template (under More templates…). If prompted, install the Microsoft.Azure.WebJobs.Extensions.Storage extension.
Select or add your Storage account connection and click Create.
Under Integrate, set the Blob Parameter Name to blobContent and click Save.
Verify your setup is correct by checking the Datadog Log explorer for your logs.
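If you want to create this forwarder function app from the command line instead of the portal, a minimal sketch using the Az.Functions module could look like the following. All names are placeholders; check Get-Help New-AzFunctionApp for the parameters available in your module version.

# Create a Windows, Node.js 18 function app backed by an existing storage account.
New-AzFunctionApp -Name <FUNCTION_APP_NAME> `
    -ResourceGroupName <RESOURCE_GROUP> `
    -Location <REGION> `
    -StorageAccountName <STORAGE_ACCOUNT> `
    -OSType Windows `
    -Runtime Node `
    -RuntimeVersion 18 `
    -FunctionsVersion 4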
Log Archiving
Archiving logs to Azure Blob Storage requires an App Registration even if you are using the Azure Native integration. To archive logs to Azure Blob Storage, follow the setup instructions to configure the integration using an App Registration. App Registrations created for archiving purposes do not need the Monitoring Reader role assigned.
Once you have an App Registration configured, you can create a log archive that writes to Azure Blob Storage.
Note: If your storage bucket is in a subscription being monitored through the Azure Native integration, a warning is displayed in the Azure Integration Tile about the App Registration being redundant. You can ignore this warning.
Further Reading
Additional helpful documentation, links, and articles: