Overview
Use this guide to set up logging from your Azure subscriptions to Datadog.
Datadog recommends using the Agent or DaemonSet to send logs from Azure. If direct streaming isn’t possible, create a log forwarding pipeline using an Azure Event Hub to collect Azure Platform Logs. For resources that cannot stream to an Event Hub, use the Blob Storage forwarding option. To collect logs from Azure Log Analytics workspaces, use the Azure Event Hub process.
Follow these steps to send Azure logs to any Datadog site.
US3: Organizations on the Datadog US3 site can simplify Azure log forwarding using the Azure Native integration. This method is recommended and is configured through the Datadog resource in Azure, replacing the Azure Event Hub process. See the Azure Native Logging Guide for more details.
Setup
To get started, click the button below and fill in the form in the Azure Portal. The Azure resources required to stream activity logs into your Datadog account are deployed for you.
Alternatively, Datadog provides automated scripts you can use for sending Azure activity logs and Azure platform logs (including resource logs).
Azure activity logs
Follow these steps to run the script that creates and configures the Azure resources required to stream activity logs into your Datadog account. These resources include activity log diagnostic settings, Azure Functions, Event Hub namespaces, and Event Hubs.
In the Azure portal, navigate to your Cloud Shell.
Run the command below to download the automation script into your Cloud Shell environment. You can also view the contents of the script.
Invoke the script by running the command below, replacing <API_KEY> with your Datadog API key and <SUBSCRIPTION_ID> with your Azure subscription ID. You can also add optional parameters to configure your deployment; see Optional parameters.
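As an illustration, an invocation from the bash Cloud Shell might look like the following. The script file name and parameter names are assumptions modeled on the optional parameters listed below; check the downloaded script for its exact interface.

```shell
# Hypothetical invocation; activity_logs_deploy.ps1, -ApiKey, and -SubscriptionId
# are assumed names -- verify them against the script you downloaded.
pwsh ./activity_logs_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID>
```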
To send Azure platform logs (including resource logs), you can deploy an Event Hub and log forwarder function pair.
After deploying, create diagnostic settings for each of the log sources to stream logs to Datadog.
Note: Resources can only stream to Event Hubs in the same Azure region.
In the Azure portal, navigate to your Cloud Shell.
Run the PowerShell command below to download the automation script into your Cloud Shell environment. You can also view the contents of the script.
Invoke the script by running the PowerShell command below, replacing <API_KEY> with your Datadog API key and <SUBSCRIPTION_ID> with your Azure subscription ID. You can also add other optional parameters to configure your deployment. See Optional parameters.
Create diagnostic settings for all Azure resources sending logs to Datadog. Configure these diagnostic settings to stream to the Event Hub you just created.
Each Azure resource deployed for the platform logs pipeline has the resource group location appended to its default name, for example datadog-eventhub-westus. You can alter this convention by overriding the corresponding parameter.
Note: Resources can only stream to Event Hubs in the same Azure region, so you need to replicate step 2 for each region you want to stream resource logs from.
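Because each region needs its own pipeline, the deployment can be repeated once per region. A sketch, where the script and parameter names are assumptions to verify against the script you downloaded:

```shell
# Hypothetical per-region deployment loop; resource_deploy.ps1 and its parameter
# names are assumed -- verify them against the script you downloaded.
for region in westus eastus2; do
  pwsh ./resource_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID> \
      -ResourceGroupLocation "$region"
done
```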
Set up both activity and resource logs
To stream both activity logs and resource logs, run the first script with the optional parameter -ResourceGroupLocation <REGION>. Activity logs are a subscription-level source, so you can create your pipeline for them in any region. Once the pipeline is deployed, send resource logs through the same Event Hub by adding diagnostic settings on your resources in <REGION>.
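For example, to anchor the combined pipeline in westus (the script file name and the -ApiKey/-SubscriptionId parameter names are assumptions; -ResourceGroupLocation is documented under Optional parameters below):

```shell
# Hypothetical combined setup: one pipeline in westus serving both activity logs
# and diagnostic settings from resources located in westus. Script name assumed.
pwsh ./activity_logs_deploy.ps1 -ApiKey <API_KEY> -SubscriptionId <SUBSCRIPTION_ID> \
    -ResourceGroupLocation westus
```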
Note: This integration does not collect events.
Optional parameters
Note: Ensure that your custom resource names are unique when you customize the following parameters. Validate that the resource name does not already exist within your list of other Azure resources.
-Flag <Default Parameter>
Description
-DatadogSite <datadoghq.com>
Send logs to a different Datadog site by adding this flag with the desired site as the parameter, for example -DatadogSite datadoghq.eu.
-Environment <AzureCloud>
Manage storage in Azure independent clouds by adding this flag as a parameter. Additional options are AzureChinaCloud, AzureGermanCloud, and AzureUSGovernment.
-ResourceGroupLocation <westus2>
Choose the region in which your Azure resource group and resources are deployed by adding this flag with an updated Azure region.
-ResourceGroupName <datadog-log-forwarder-rg>
Customize the name of your Azure resource group by adding this flag with an updated parameter.
Customize the name of your Azure function app by adding this flag with an updated parameter. By default, datadog-functionapp-<globally-unique-ID> is generated.
-FunctionName <datadog-function>
Customize the name of your Azure Function by adding this flag with an updated parameter.
The instructions below walk through a basic, initial setup using the Azure Portal. All of these steps can also be performed with the Azure CLI, PowerShell, or resource templates; see the Azure documentation.
Note: Resources can only stream to Event Hubs in the same Azure region.
In the Azure portal, navigate to the Event Hubs overview and click Create.
Fill in the Project Details and Instance Details sections as desired. Note: If you plan to collect Azure resource logs, the Event Hub must be in the same Location as the resource you want to collect logs from. For activity logs or other account-wide log sources, you can choose any region.
Click Review + create to validate the resource. If validation is successful, click Create.
In the Azure portal, navigate to your new or existing Event Hubs namespace.
Click + Event Hub.
Configure the Basics and Capture tabs as desired.
Click Review + create to validate the resource. If validation is successful, click Create.
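The same namespace and Event Hub can be created with the Azure CLI instead of the portal. A sketch, where the resource group, namespace, Event Hub name, and region are placeholders:

```shell
# Create an Event Hubs namespace in the region of the resources you want to
# collect logs from (resources can only stream within their own region).
az eventhubs namespace create \
  --resource-group <RESOURCE_GROUP> \
  --name <NAMESPACE> \
  --location westus2

# Create an Event Hub inside that namespace.
az eventhubs eventhub create \
  --resource-group <RESOURCE_GROUP> \
  --namespace-name <NAMESPACE> \
  --name <EVENTHUB>
```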
Configure shared access
In the detail page of your Event Hub, click Shared access policies under the Settings tab to the left.
Click + Add.
Provide a policy name and select Listen.
Copy the Connection string-primary key value and keep it somewhere safe. This is needed to allow the Datadog-Azure function to communicate with the Event Hub.
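The shared access policy and its connection string can also be created and retrieved from the CLI. A sketch with placeholder names; the policy name datadog-listen is illustrative:

```shell
# Create a Listen policy on the Event Hub for the Datadog-Azure function.
az eventhubs eventhub authorization-rule create \
  --resource-group <RESOURCE_GROUP> \
  --namespace-name <NAMESPACE> \
  --eventhub-name <EVENTHUB> \
  --name datadog-listen \
  --rights Listen

# Print the primary connection string to store somewhere safe.
az eventhubs eventhub authorization-rule keys list \
  --resource-group <RESOURCE_GROUP> \
  --namespace-name <NAMESPACE> \
  --eventhub-name <EVENTHUB> \
  --name datadog-listen \
  --query primaryConnectionString --output tsv
```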
In the Instance Details section, configure the following settings: a. Select the Code radio button. b. For Runtime stack, select Node.js. c. For Version, select 18 LTS.
Configure other settings as desired.
Click Review + create to validate the resource. If validation is successful, click Create.
Note: If you don’t want to paste your Datadog API key value directly into the function’s code, create an additional environment variable for the Datadog API key value.
Add a new function to your Function App using the Event Hub trigger template.
Add your Datadog API key through a DD_API_KEY environment variable, or copy it into the function code by replacing <DATADOG_API_KEY> on line 21.
If you’re not using the Datadog US1 site, set your Datadog site with a DD_SITE environment variable under the configuration tab of your function app, or copy the site parameter into the function code on line 22.
Save the function.
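If you prefer to keep the API key out of the function code, the environment variables can also be set from the CLI. A sketch; the Function App and resource group names are placeholders, and DD_SITE is only needed if you are not on US1:

```shell
# Set the Datadog API key and site as Function App settings instead of
# pasting them into the function code.
az functionapp config appsettings set \
  --name <FUNCTION_APP_NAME> \
  --resource-group <RESOURCE_GROUP> \
  --settings "DD_API_KEY=<DATADOG_API_KEY>" "DD_SITE=datadoghq.com"
```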
Click Integration under the Developer side menu.
Click Azure Event Hubs under Trigger and inputs.
Confirm the following settings are in place: a. Event hub connection is set to the name of your connection string environment variable. b. Event parameter name is set to eventHubMessages. c. Event hub name is set to the name of your Event Hub. d. Event hub cardinality is set to Many. e. Event hub data type is left empty.
To validate your setup, click Code + Test under the Developer side menu.
Click Test/Run and enter a test message in valid JSON format.
In the Azure portal, navigate to the Activity log.
Click Export Activity Logs.
Click + Add diagnostic setting.
Under Categories, select the categories of logs you want to send to Datadog.
Under Destination details, select Stream to an event hub.
Set Event hub namespace and Event hub name to the namespace and Event Hub that were used to create your Event Hub trigger.
For Event hub policy name, you can select RootManageSharedAccessKey if desired. Optionally, create your own shared access policy at the Event Hub namespace level: a. In the Event Hub namespace, click Shared access policies under the Settings tab to the left. b. Click + Add. c. Provide a policy name and select Send or Manage. d. Click Save. e. Return to the diagnostic setting page and select your shared access policy for the Event hub policy name field. You may need to refresh the page. Note: See Authorizing access to Event Hubs resources using Shared Access Signatures for more information.
Verify your setup is correct by checking the Datadog Log Explorer for your activity logs.
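The same subscription-level diagnostic setting can be sketched with the Azure CLI. The authorization rule ID must point to a policy with Send rights, and the category list below is illustrative, not exhaustive:

```shell
# Stream subscription activity logs to the Event Hub.
# <AUTH_RULE_ID> is the full resource ID of an Event Hub authorization rule
# with Send rights.
az monitor diagnostic-settings subscription create \
  --name datadog-activity-logs \
  --location westus2 \
  --event-hub-name <EVENTHUB> \
  --event-hub-auth-rule <AUTH_RULE_ID> \
  --logs '[{"category": "Administrative", "enabled": true},
           {"category": "Security", "enabled": true}]'
```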
Configure your Azure resources to forward their logs to the Event Hub with a diagnostic setting.
In the Azure portal, navigate to the resource that you want to forward logs to Datadog.
In the Monitoring section of the resource blade, click Diagnostic settings.
Click Add diagnostic setting.
Provide a name and select the sources of the data you want to forward.
Under Destination details, select Stream to an event hub.
Set Event hub namespace and Event hub name to the namespace and Event Hub that were used to create your Event Hub trigger.
For Event hub policy name, you can select RootManageSharedAccessKey if desired. Optionally, create your own shared access policy at the Event Hub namespace level: a. In the Event Hub namespace, click Shared access policies under the Settings tab to the left. b. Click + Add. c. Provide a policy name and select Send or Manage. d. Click Save. e. Return to the diagnostic setting page and select your shared access policy for the Event hub policy name field. You may need to refresh the page. Note: See Authorizing access to Event Hubs resources using Shared Access Signatures for more information.
Click Save.
Verify your setup is correct by checking the Datadog Log Explorer for logs from this resource.
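A per-resource diagnostic setting can also be sketched with the Azure CLI. Placeholders must be filled in, and the "allLogs" category group is only available for resource types that support category groups:

```shell
# Attach a diagnostic setting to a single resource so its logs stream to the
# Event Hub. <RESOURCE_ID> is the full Azure resource ID of the resource.
az monitor diagnostic-settings create \
  --name datadog-logs \
  --resource <RESOURCE_ID> \
  --event-hub <EVENTHUB> \
  --event-hub-rule <AUTH_RULE_ID> \
  --logs '[{"categoryGroup": "allLogs", "enabled": true}]'
```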
Datadog recommends using the Event Hub setup for Azure log collection. However, you can also follow the steps in this section to forward all of your Azure App Services logs from Azure Blob Storage:
If you haven’t already set up Azure Blob Storage, use one of the following methods to get started:
In the Instance Details section, configure the following settings: a. Select the Code radio button b. For Runtime stack, select Node.js c. For Version, select 18 LTS. d. For Operating System, select Windows.
Configure other settings as desired.
Click Review + create to validate the resource. If validation is successful, click Create.
Add a new function to your Function App using the Azure Blob Storage trigger template.
Add your Datadog API key with a DD_API_KEY environment variable, or copy it into the function code by replacing <DATADOG_API_KEY> on line 20.
If you’re not using the Datadog US1 site, set your Datadog site with a DD_SITE environment variable under the configuration tab of your function app, or copy the site parameter into the function code on line 21.
Save the function.
Click Integration under the Developer side menu.
Click Azure Blob Storage under Trigger and inputs.
Set the Blob Parameter Name to blobContent and click Save.
Verify your setup is correct by checking the Datadog Log Explorer for logs from this resource.
Advanced configuration
Refer to the following topics to configure your installation according to your monitoring needs.
PCI compliance
PCI DSS compliance for APM and Log Management is only available for Datadog organizations in the US1 site.
To set up PCI-compliant Log Management, you must meet the requirements outlined in PCI DSS Compliance. Send your logs to the dedicated PCI compliant endpoint:
agent-http-intake-pci.logs.datadoghq.com:443 for Agent traffic
http-intake-pci.logs.datadoghq.com:443 for non-Agent traffic
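To check that the non-Agent PCI endpoint is reachable, a test log can be posted to it. This sketch assumes the standard Datadog v2 logs intake path (/api/v2/logs) applies to the PCI host, and the payload fields are illustrative:

```shell
# Send a test log to the PCI-compliant non-Agent intake (US1 only).
# Replace <DATADOG_API_KEY> with your Datadog API key.
curl -X POST "https://http-intake-pci.logs.datadoghq.com/api/v2/logs" \
  -H "DD-API-KEY: <DATADOG_API_KEY>" \
  -H "Content-Type: application/json" \
  -d '[{"message": "pci intake test", "ddsource": "azure", "service": "test"}]'
```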
Archiving logs to Azure Blob Storage requires an App Registration even if you are using the Azure Native integration. To archive logs to Azure Blob Storage, follow the setup instructions to configure the integration using an App Registration. App Registrations created for archiving purposes do not need the Monitoring Reader role assigned.
Once you have an App Registration configured, you can create a log archive that writes to Azure Blob Storage.
Note: If your storage bucket is in a subscription being monitored through the Azure Native integration, a warning is displayed in the Azure Integration Tile about the App Registration being redundant. You can ignore this warning.
Further Reading
Additional helpful documentation, links, and articles: