Send Azure Logs to Datadog
Automated Log Forwarding for Azure (in Preview)
Automatically set up log forwarding across your Azure environment—no manual configuration required! This feature automatically manages and scales log forwarding services.
Overview
Use this guide to set up logging from your Azure subscriptions to Datadog.
Datadog recommends using the Agent or DaemonSet to send logs from Azure. If direct streaming isn’t possible, create a log forwarding pipeline using an Azure Event Hub to collect Azure Platform Logs. For resources that cannot stream to an Event Hub, use the Blob Storage forwarding option. To collect logs from Azure Log Analytics workspaces, use the Azure Event Hub process.
Follow these steps to send Azure logs to any Datadog site.
US3: Organizations on the Datadog US3 site can simplify Azure log forwarding using the Azure Native integration. This method is recommended and is configured through the Datadog resource in Azure, replacing the Azure Event Hub process. See the Azure Native Logging Guide for more details.
Starting April 30, 2025, Azure no longer supports Node.js 18. To ensure compatibility, update your deployment by redeploying the Azure Resource Manager (ARM) template with the same parameters.
Setup
To get started, click the button below and fill in the form in the Azure Portal. The Azure resources required to stream activity logs into your Datadog account are deployed for you. To forward Activity Logs, set the Send Activity Logs option to true.

After the template deployment finishes, set up diagnostic settings for each log source to send Azure platform logs (including resource logs) to the Event Hub created during deployment.
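Diagnostic settings can also be created programmatically. The sketch below builds the settings payload for streaming a resource's platform logs to the deployed Event Hub; the SDK call shown in comments assumes the @azure/arm-monitor package, and the namespace, authorization rule, and Event Hub names are placeholders, not values from this guide:

```javascript
// Sketch: build a diagnostic-settings payload that streams a resource's
// platform logs to the Event Hub created by the template deployment.
// "allLogs" is the built-in category group covering every log category
// a resource exposes.
function buildDiagnosticSetting(authRuleId, eventHubName) {
  return {
    eventHubAuthorizationRuleId: authRuleId,
    eventHubName: eventHubName,
    logs: [{ categoryGroup: "allLogs", enabled: true }],
  };
}

// Placeholder IDs -- substitute your own subscription, resource group,
// and Event Hub namespace.
const authRuleId =
  "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>" +
  "/providers/Microsoft.EventHub/namespaces/<NAMESPACE>" +
  "/authorizationRules/RootManageSharedAccessKey";
const setting = buildDiagnosticSetting(authRuleId, "datadog-eventhub");

// With @azure/arm-monitor (assumption: v7-style API), the setting could
// then be applied to a resource in the same region as the Event Hub:
//
//   const { MonitorClient } = require("@azure/arm-monitor");
//   const { DefaultAzureCredential } = require("@azure/identity");
//   const client = new MonitorClient(new DefaultAzureCredential(), subscriptionId);
//   await client.diagnosticSettings.createOrUpdate(resourceUri, "datadog-logs", setting);

console.log(JSON.stringify(setting, null, 2));
```

Because a diagnostic setting can only target an Event Hub in the same region as the resource, one setting per region is typically needed.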
Note: Resources can only stream to Event Hubs in the same Azure region.
If you run into any problems during deployment, see Automated log collection for common error cases.
Datadog recommends using the Event Hub setup for Azure log collection. However, you can also follow the steps in this section to forward all of your Azure App Services logs from Azure Blob Storage:
- If you haven’t already set up Azure Blob Storage, use one of the following methods to get started:
- Set up the Datadog-Azure Function to forward logs from Blob Storage using the instructions below.
- Configure your Azure App Services to forward their logs to Blob Storage.
Create a function app
If you already have a function app configured for this purpose, skip to Add a new function to your Function App using the Azure Blob Storage trigger template.
- In the Azure portal, navigate to the Function App overview and click Create.
- In the Instance Details section, configure the following settings:
  a. Select the Code radio button.
  b. For Runtime stack, select Node.js.
  c. For Version, select 18 LTS.
  d. For Operating System, select Windows.
- Configure other settings as desired.
- Click Review + create to validate the resource. If validation is successful, click Create.
Add a new function to your Function App using the Azure Blob Storage trigger template
- Select your new or existing function app from the Function App overview.
- Under the Functions tab, click Create.
- For the Development environment field, select Develop in portal.
- Under Select a template, choose Azure Blob storage trigger.
- Select your Storage account connection.
Note: See Configure a connection string for an Azure storage account for more information.
- Click Create.
See Getting started with Azure Functions for more information.
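Behind the scenes, the portal steps above produce a function.json binding for the function. A hand-written equivalent might look like the following sketch; the container path and connection setting name are assumed defaults, not values from this guide:

```json
{
  "bindings": [
    {
      "name": "blobContent",
      "type": "blobTrigger",
      "direction": "in",
      "path": "logs-container/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

The name field must match the Blob Parameter Name configured in a later step.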
Point your Blob Storage trigger to Datadog
- On the detail page of your Blob Storage trigger function, click Code + Test under the Developer side menu.
- Add the Datadog-Azure Function code to the function’s index.js file.
- Add your Datadog API key with a DD_API_KEY environment variable, or copy it into the function code by replacing <DATADOG_API_KEY> on line 20.
- If you’re not using the Datadog US1 site, set your Datadog site with a DD_SITE environment variable under the configuration tab of your function app, or copy the site parameter into the function code on line 21.
- Save the function.
- Click Integration under the Developer side menu.
- Click Azure Blob Storage under Trigger and inputs.
- Set the Blob Parameter Name to blobContent and click Save.
- Verify your setup is correct by checking the Datadog Log Explorer for logs from this resource.
Advanced configuration
Refer to the following topics to configure your installation according to your monitoring needs.
PCI compliance
PCI DSS compliance for APM and Log Management is only available for Datadog organizations in the US1 site.
To set up PCI-compliant Log Management, you must meet the requirements outlined in PCI DSS Compliance. Send your logs to the dedicated PCI compliant endpoint:
Under Settings > Environment variables, click Add to set the following environment variable:
- Name: DD_URL
- Value: http-intake-pci.logs.datadoghq.com
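A forwarder that honors this override could resolve its intake host as in the sketch below; the fallback host shape is an assumption based on Datadog's standard intake naming, not part of this guide:

```javascript
// Sketch: resolve the intake host, letting DD_URL (for example the
// PCI-compliant endpoint) override the site-derived default.
function resolveIntakeHost(env) {
  if (env.DD_URL) return env.DD_URL;            // explicit override, e.g. PCI endpoint
  const site = env.DD_SITE || "datadoghq.com";  // default to the US1 site
  return `http-intake.logs.${site}`;            // assumed standard host shape
}

console.log(resolveIntakeHost({ DD_URL: "http-intake-pci.logs.datadoghq.com" }));
console.log(resolveIntakeHost({}));
```

With the DD_URL variable set as above, the PCI endpoint wins over any DD_SITE value.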
Log Archiving
Archiving logs to Azure Blob Storage requires an App Registration, even if you are using the Azure Native integration. To archive logs to Azure Blob Storage, follow the automatic or manual setup instructions to configure the integration using an App Registration. App Registrations created for archiving purposes do not need the Monitoring Reader role assigned.
After configuring an App Registration, you can create a log archive that writes to Azure Blob Storage.
Note: If your storage bucket is in a subscription being monitored through the Azure Native integration, a warning is displayed in the Azure Integration Tile about the App Registration being redundant. You can ignore this warning.
Further Reading
Additional helpful documentation, links, and articles: