Microsoft Azure

Overview

Datadog’s Azure integration enables the collection of metrics and logs from your Azure environment. The configuration options are different depending on which Datadog site your organization is using:

All Sites: All Datadog sites can use the App Registration credential process for implementing metric collection, and the Event Hub setup for sending Azure Platform Logs.
US3: If your org is on the Datadog US3 site, use the Azure Native integration to streamline management and data collection for your Azure environment. Datadog recommends using this method when possible. Setup entails creating a Datadog resource in Azure to link your Azure subscriptions to your Datadog organization. This replaces the App Registration credential process for metric collection and Event Hub setup for log forwarding.

Select site US3 in the side panel of this page or change the site selector to ensure you're seeing the US3 version of the documentation.

Connect to Microsoft Azure to:

  • Get metrics from Azure VMs with or without installing the Datadog Agent.
  • Collect standard Azure Monitor metrics for all Azure services: Application Gateway, App Service (Web & Mobile), Batch Service, Event Hub, IoT Hub, Logic App, Redis Cache, Server Farm (App Service Plan), SQL Database, SQL Elastic Pool, Virtual Machine Scale Set, and many more.
  • Tag your Azure metrics with Azure-specific information about the associated resource, such as region, resource group, and custom Azure tags.
  • Get Datadog generated metrics to provide unique insights into your Azure environment.
  • Correlate data from your Azure applications across logs, metrics, APM tracing, user activity, and more within your Datadog organization.
Datadog's Azure integration is built to collect all metrics from Azure Monitor. Datadog strives to continually update the docs to show every sub-integration, but cloud services rapidly release new metrics and services so the list of integrations can sometimes lag.
The azure.*.status and azure.*.count metrics are generated by Datadog from Azure Resource Health. For more information, see Azure Status and Count Metrics.
Integration | Description
Analysis Services | A service that provides data models in the cloud.
API Management | A service to publish, secure, transform, maintain, and monitor APIs.
App Service | A service for deploying and scaling web, mobile, API, and business logic applications.
App Service Environment | A service that provides an environment for securely running App Service apps at high scale.
App Service Plan | A set of compute resources for a web app to run.
Application Gateway | A web traffic load balancer that enables you to manage traffic to your web applications.
Automation | A service that provides automation and configuration management across your environments.
Batch Service | Managed task scheduler and processor.
Cognitive Services | APIs, SDKs, and services available to help build applications without AI or data science knowledge.
Container Instances | A service to deploy containers without the need to provision or manage the underlying infrastructure.
Container Service | A production-ready Kubernetes, DC/OS, or Docker Swarm cluster.
Cosmos DB | A database service that supports document, key-value, wide-column, and graph databases.
Customer Insights | Enables organizations to bring together data sets to build a 360° view of their customers.
Data Explorer | Fast and highly scalable data exploration service.
Data Factory | A service to compose data storage, movement, and processing services into automated data pipelines.
Data Lake Analytics | An analytics job service that simplifies big data.
Data Lake Store | A no-limits data lake that powers big data analytics.
Database for MariaDB | A service that provides a fully managed, enterprise-ready community MariaDB database.
Event Grid | An event routing service that allows for uniform event consumption using a publish-subscribe model.
Event Hub | Large-scale data stream managed service.
ExpressRoute | A service to extend your on-premises networks into the cloud.
Firewall | Cloud-native network security to protect your Azure Virtual Network resources.
Functions | A service for running serverless code in response to event triggers.
HDInsights | A cloud service that processes massive amounts of data.
IoT Hub | Connect, monitor, and manage billions of IoT assets.
Key Vault | A service to safeguard and manage cryptographic keys and secrets used by cloud applications and services.
Load Balancer | Scale your applications and create high availability for your services.
Logic App | Build powerful integration solutions.
Machine Learning | Enterprise-grade machine learning service to build and deploy models faster.
Network Interfaces | Enables VM communication with internet, Azure, and on-premises resources.
Notification Hubs | A push engine that allows you to send notifications to any platform from any backend.
Public IP Address | A resource that enables inbound communication and outbound connectivity from the Internet.
Recovery Service Vault | An entity that stores the backups and recovery points created over time.
Redis Cache | Managed data cache.
Relay | Securely expose services that run in your corporate network to the public cloud.
Cognitive Search | A search-as-a-service cloud solution that provides tools for adding a rich search experience.
Storage | Storage for blobs, files, queues, and tables.
Stream Analytics | An event-processing engine to examine high volumes of data streaming from devices.
SQL Database | Highly scalable relational database in the cloud.
SQL Database Elastic Pool | Manage the performance of multiple databases.
Synapse Analytics | An analytics service that brings together data integration, enterprise data warehousing, and big data analytics.
Usage and Quotas | Follow your Azure usage.
Virtual Machine | Virtual machine management service.
Virtual Machine Scale Set | Deploy, manage, and autoscale a set of identical VMs.
Virtual Network | Allow Azure resources to securely communicate with each other, the internet, and on-premises networks.

Setup

The Azure Native integration (available for customers on Datadog's US3 site) has different setup instructions. Select site US3 in the side panel of this page or change the site selector to ensure you're seeing the US3 version of the documentation.

Installation

Integrate your Microsoft Azure account with Datadog using the Azure CLI tool or the Azure portal. This integration method works automatically for all Azure Clouds: Public, China, German, and Government.

Follow the instructions below and Datadog detects automatically which Cloud you are using to complete the integration.

Integrating through the Azure CLI

To integrate Datadog with Azure using the Azure CLI, Datadog recommends using the Azure Cloud Shell.

First, log in to the Azure account you want to integrate with Datadog:

az login

Create a service principal and configure its access to Azure resources:

az ad sp create-for-rbac

Display a list of subscriptions so that you can copy and paste the subscription_id:

az account list --output table

Create an application as a service principal using the format:

az ad sp create-for-rbac --role "Monitoring Reader" --scopes /subscriptions/{subscription_id}

Example Output:

{
  "appId": "0dd17b1e-54a4-45ae-b168-232b14b01f88",
  "displayName": "azure-cli-2025-02-23-04-27-19",
  "password": "dj-8Q~hKbQwU93Q0FBfIZ_pI5ZtaLoRxaws8Dca5",
  "tenant": "4d3bac44-0230-4732-9e70-cc00736f0a97"
}
  • This command grants the Service Principal the monitoring reader role for the subscription you would like to monitor.
  • The appID generated from this command must be entered in the Datadog Azure Integration tile under Client ID.
  • Enter the generated Tenant ID value in the Datadog Azure Integration tile under Tenant name/ID.
  • --scopes can support multiple values, and you can add multiple subscriptions or Management Groups at once. See the examples in the az ad sp documentation.
  • Add --name <CUSTOM_NAME> to use a hand-picked name, otherwise Azure generates a unique one. The name is not used in the setup process.
  • Add --password <CUSTOM_PASSWORD> to use a hand-picked password. Otherwise Azure generates a unique one. This password must be entered in the Datadog Azure Integration tile under Client Secret.
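
For example, the following creates the service principal with a custom display name and grants it the Monitoring Reader role on two subscriptions at once (the name and subscription IDs are placeholders):

az ad sp create-for-rbac --name datadog-monitoring --role "Monitoring Reader" --scopes /subscriptions/<SUBSCRIPTION_ID_1> /subscriptions/<SUBSCRIPTION_ID_2>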

Management Group is a valid and recommended option for scope. For example:

az account management-group entities list --query "[?inheritedPermissions!='noaccess' && permissions!='noaccess'].{Name:displayName,Id:id}" --output table
  • This command displays all the subscriptions and management groups a user has access to.
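
For example, to grant the service principal the Monitoring Reader role on a single Management Group (and therefore on every subscription under it), pass the management group's resource ID as the scope. The group ID below is a placeholder:

az ad sp create-for-rbac --role "Monitoring Reader" --scopes /providers/Microsoft.Management/managementGroups/<MANAGEMENT_GROUP_ID>
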
Quick launch of the Azure CLI commands

You can also combine the above commands into a single command:

az ad sp create-for-rbac --role "Monitoring Reader" --scopes `az account management-group entities list --query "[?inheritedPermissions!='noaccess' && permissions!='noaccess'].id | join(' ', @)" -o tsv`
  • This command joins those IDs together and creates the Service Principal.
  • It creates a user and assigns roles to every management group and subscription you have access to.

If you are using the Azure classic CLI instead, first log in to the Azure account you want to integrate with Datadog:

azure login

Run the account show command:

azure account show

Enter the generated Tenant ID value in the Datadog Azure Integration tile under Tenant name/ID.

Create a name and password:

azure ad sp create -n <NAME> -p <PASSWORD>
  • The <NAME> is NOT used but is required as part of the setup process.
  • The <PASSWORD> you choose must be entered in the Datadog Azure Integration tile under Client Secret.
  • The Object Id returned from this command is used in place of <OBJECT_ID> in the next command.

Create an application as a service principal using the format:

azure role assignment create --objectId <OBJECT_ID> -o "Monitoring Reader" -c /subscriptions/<SUBSCRIPTION_ID>/
  • This command grants the Service Principal the monitoring reader role for the subscription you would like to monitor.
  • The Service Principal Name generated from this command must be entered in the Datadog Azure Integration tile under Client ID.
  • <SUBSCRIPTION_ID> is the Azure subscription you would like to monitor, and is listed as ID with azure account show or in the portal.

Integrating through the Azure portal

  1. Create an app registration in your Active Directory and pass the correct credentials to Datadog.
  2. Give the application read-access to any subscriptions you would like to monitor.
Creating the app registration
  1. Under Azure Active Directory, navigate to App Registrations and click New registration.

  2. Enter the following and click the Create button. The name and sign-on URL are not used but are required for the setup process.

    • Name: Datadog Auth
    • Supported Account Types: Accounts in this organizational directory only (Datadog)
    • Redirect URI:
Azure create app
Giving read permissions to the application
  1. To assign access at the individual subscription level, navigate to Subscriptions through the search box or the left sidebar.
Subscriptions icon

To assign access at the Management Group level, navigate to Management Groups and select the Management Group that contains the set of subscriptions you would like to monitor.
Note: Assigning access at the Management Group level means that any new subscriptions added to the group are automatically discovered and monitored by Datadog.

Management groups icon

To configure monitoring for the entire tenant, assign access to the Tenant Root Group.

  2. Click on the subscription you would like to monitor.

  3. Select Access control (IAM) in the subscription menu and click Add > Add role assignment:

    Add Role Assignment
  4. For Role, select Monitoring Reader. Under Select, choose the name of the Application you just created:

  5. Click Save.

  6. Repeat this process for any additional subscriptions you want to monitor with Datadog. Note: Users of Azure Lighthouse can add subscriptions from customer tenants.

Note: Diagnostics must be enabled for ARM-deployed VMs to collect metrics. See Enable diagnostics.

Completing the integration
  1. Under App Registrations, select the App you created, copy the Application ID and Tenant ID, and paste the values in the Datadog Azure Integration tile under Client ID and Tenant ID.

  2. For the same app, go to Manage > Certificates and secrets.

  3. Add a new Client Secret called datadogClientSecret, select a timeframe for Expires, and click Add:

    Azure client secret
  4. When the key value is shown, copy and paste the value in the Datadog Azure Integration tile under Client Secret and click Install Integration or Update Configuration.

Note: Your updates to the Azure configuration can take up to 20 minutes to be reflected in Datadog.

Configuration

The Azure integration tile includes configurable options and settings. All settings are configured at the App Registration level and apply to all subscriptions within scope.

Azure integration tile in Datadog, set to the 'Configuration' tab. Underneath 'App Registrations' are a number of 'Client ID' headings. The top one is expanded to show configuration options, including client ID, tenant ID, client secret. Underneath are textboxes to populate 'Metric Collection Filters', followed by toggles for 'Collect Custom Metrics', 'Monitor Automuting', and 'Resource Collection'.

These settings include:

Metric Collection Filters: Use these filters to limit the hosts and App Service Plans that are monitored by Datadog. This is useful to control your Datadog infrastructure costs. For details on how to set up these filters, see metric collection filters, below. For more information about billing, see the Azure Integration Billing Guide.

Collect Custom Metrics: Enable/disable the collection of custom metrics from Azure App Insights.

Note: Custom metrics collected with this option appear in Datadog under the namespace application_insights.custom.<METRIC_NAME>. These include all custom metrics from all App Insights instances within the scope of the integration. App Insights standard metrics are included in Datadog automatically as standard metrics under the azure.insights namespace. Additional Azure App Insights metrics are considered custom metrics in Datadog and may impact your costs.

Monitor Automuting: Enable/disable automuting for Azure VMs.

Resource Collection: Enable/disable Cloud Security Posture Management.

Metric collection filters

Enter a list of tags in the text box under Metric Collection Filters.

This list of tags in <KEY>:<VALUE> form is separated by commas and defines a filter used while collecting metrics. Wildcards such as ? (for single characters) and * (for multiple characters) can also be used.

Only VMs that match one of the defined tags are imported into Datadog. The rest are ignored. VMs matching a given tag can also be excluded by adding ! before the tag. For example:

datadog:monitored,env:production,!env:staging,instance-type:c1.*
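
These filters match against the Azure tags on the VMs themselves. As a sketch, you could apply the datadog:monitored tag from the example above to a VM with the Azure CLI (the resource ID is a placeholder):

az tag update --resource-id /subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.Compute/virtualMachines/<VM_NAME> --operation Merge --tags datadog=monitored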

Monitor the integration status

Once the integration is configured, Datadog begins running a continuous series of calls to Azure APIs to collect critical monitoring data from your Azure environment. Sometimes these calls return errors (for example, if the provided credentials have expired). These errors can inhibit or block Datadog’s ability to collect monitoring data.

When critical errors are encountered, the Azure integration generates events in the Datadog Events Explorer, and republishes them every five minutes. You can configure an Event Monitor to trigger when these events are detected and notify the appropriate team.

Datadog provides a recommended monitor you can use as a template to get started. To use the recommended monitor:

  1. In Datadog, go to Monitors -> New Monitor and select the Recommended Monitors tab.
  2. Select the recommended monitor titled [Azure] Integration Errors.
  3. Make any desired modifications to the search query or alert conditions. By default, the monitor triggers whenever a new error is detected, and resolves when the error has not been detected for the past 15 minutes.
  4. Update the notification and re-notification messages as desired. The events contain pertinent information about the error and are included in the notification automatically, including detailed information about the scope, the error response, and common steps to remediate.
  5. Configure notifications through your preferred channels (email, Slack, PagerDuty, or others) to make sure your team is alerted about issues affecting Azure data collection.

Metrics collection

After the integration tile is set up, metrics are collected by a crawler. To collect additional metrics, deploy the Datadog Agent to your VMs:

Agent installation

You can use the Azure extension to install the Datadog Agent on Windows VMs, Linux x64 VMs, and Linux ARM-based VMs.

  1. In the Azure portal, navigate to your VM > Settings > Extensions > Add and select Datadog Agent.
  2. Click Create, enter your Datadog API key, and click OK.

To install the Agent based on your operating system or CI/CD tool, see the Datadog Agent install instructions.

Note: Domain controllers are not supported when installing the Datadog Agent with the Azure extension.
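
The extension can also be installed from the command line. The sketch below assumes the Azure CLI; the publisher and extension names (Datadog.Agent, DatadogLinuxAgent) and the settings key are assumptions to verify against the output of the first command before use:

# Confirm the exact publisher and extension names available in your region (assumed below)
az vm extension image list --location <REGION> --query "[?contains(publisher, 'Datadog')]" --output table

# Install the (assumed) Linux extension on a VM, passing your API key as a protected setting
az vm extension set --resource-group <RESOURCE_GROUP> --vm-name <VM_NAME> --publisher Datadog.Agent --name DatadogLinuxAgent --protected-settings '{"api_key": "<DATADOG_API_KEY>"}'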

Validation

It may take a few minutes for metrics from applications under the new subscription to appear.

Navigate to the Azure VM Default Dashboard to see this dashboard populate with your infrastructure’s data:

Azure VM screenboard

Log collection

Note: The Azure Native integration (available for customers on Datadog’s US3 site) has different setup instructions. If you are using the Azure Native integration, change the site selector to see US3-specific instructions. For more details, see the Overview.

The best method for submitting logs from Azure to Datadog is with the Agent or DaemonSet. For some resources, this may not be possible. In these cases, Datadog recommends creating a log forwarding pipeline using an Azure Event Hub to collect Azure Platform Logs. For resources that cannot stream Azure Platform Logs to an Event Hub, you can use the Blob Storage forwarding option.

To get started, click the button below and fill in the form on Azure Portal. The Azure resources required to get activity logs streaming into your Datadog account will be deployed for you.

Deploy to Azure

Alternatively, Datadog provides two automated scripts you can use.

The first script creates and configures the Azure resources required to get activity logs streaming into your Datadog account. These resources include Activity Log diagnostic settings, Azure Functions, Event Hub namespaces, and Event Hub.

The second script is a more generic option that deploys only the Event Hub and Azure Function portions, without any diagnostic settings. This can be used to configure the streaming sources. In either case, the Event Hubs can be used by other streaming sources.

Example:

If you want to stream both activity logs and resource logs from westus, run the first script including the optional parameter -ResourceGroupLocation westus (activity logs are a subscription-level source, so you can create your pipeline for them in any region). Once this is deployed, you can send resource logs through the same Event Hub by adding diagnostic settings on your resources in westus.
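
For instance, the westus deployment described above could be launched as follows (the API key and subscription ID are placeholders):

./activity_logs_deploy.ps1 -ApiKey <api_key> -SubscriptionId <subscription_id> -ResourceGroupLocation westus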

Note:

This integration does not collect events.

Sending activity logs from Azure to Datadog

Step 1: In the Azure portal, navigate to your Cloud Shell.

azure cloud shell

Step 2: Run the command below to download the automation script into your Cloud Shell environment.

(New-Object System.Net.WebClient).DownloadFile("https://raw.githubusercontent.com/DataDog/datadog-serverless-functions/master/azure/eventhub_log_forwarder/activity_logs_deploy.ps1", "activity_logs_deploy.ps1")

You can also view the contents of the script.

Step 3: Invoke the script by running the command below, replacing <api_key> with your Datadog API token and <subscription_id> with your Azure Subscription ID. You can also add other optional parameters to configure your deployment. See Optional Parameters.

./activity_logs_deploy.ps1 -ApiKey <api_key> -SubscriptionId <subscription_id> 

Sending Azure Platform logs to Datadog

For a generic solution for sending Azure Platform Logs (including resource logs), you can also deploy just the Event Hub and log forwarder. After you deploy this pipeline, you can create diagnostic settings for each of the log sources, configuring them to stream to Datadog.

Step 1: In the Azure portal, navigate to your Cloud Shell.

Step 2: Run the command below to download the automation script into your Cloud Shell environment.

(New-Object System.Net.WebClient).DownloadFile("https://raw.githubusercontent.com/DataDog/datadog-serverless-functions/master/azure/eventhub_log_forwarder/resource_deploy.ps1", "resource_deploy.ps1")

You can also view the contents of the script.

Step 3: Invoke the script by running the command below, replacing <api_key> with your Datadog API token and <subscription_id> with your Azure Subscription ID. You can also add other optional parameters to configure your deployment. See Optional Parameters.

./resource_deploy.ps1 -ApiKey <api_key> -SubscriptionId <subscription_id> 

Step 4: Create diagnostic settings for all Azure resources that send logs to Datadog. Configure these diagnostic settings to start streaming to the Event Hub you just created.

Note: Resources can only stream to Event Hubs in the same Azure region, so you need to repeat this deployment for each region you want to stream resource logs from.

Note: Each of the Azure resources deployed for the Platform Logs pipeline has its resource group location appended to its default name, for example datadog-eventhub-westus. You can alter this convention by overriding the corresponding parameter.

Optional parameters

Note: Ensure that your custom resource names are unique when you customize the parameters. Validate that the resource name does not already exist within your list of other Azure resources.

-Flag <Default Parameter> | Description
-DatadogSite <datadoghq.com> | Customize your Datadog instance by adding this flag with another Datadog site as a parameter.
-Environment <AzureCloud> | Manage storage in Azure independent clouds by adding this flag as a parameter. Additional options are AzureChinaCloud, AzureGermanCloud, and AzureUSGovernment.
-ResourceGroupLocation <westus2> | Choose the region in which your Azure resource group and resources are deployed by adding this flag with an updated Azure region.
-ResourceGroupName <datadog-log-forwarder-rg> | Customize the name of your Azure resource group by adding this flag with an updated parameter.
-EventhubNamespace <datadog-ns-4c6c53b4-1abd-4798-987a-c8e671a5c25e> | Customize your Azure Event Hub namespace by adding this flag with an updated parameter. By default, datadog-ns-<globally-unique-ID> is generated.
-EventhubName <datadog-eventhub> | Customize the name of your Azure Event Hub by adding this flag with an updated parameter.
-FunctionAppName <datadog-functionapp-1435ad2f-7c1f-470c-a4df-bc7289d8b249> | Customize the name of your Azure function app by adding this flag with an updated parameter. By default, datadog-functionapp-<globally-unique-ID> is generated.
-FunctionName <datadog-function> | Customize the name of your Azure Function by adding this flag with an updated parameter.
-DiagnosticSettingName <datadog-activity-logs-diagnostic-setting> | Customize the name of your Azure diagnostic setting by adding this flag with an updated parameter. (Only relevant for sending activity logs.)

Installation errors? See the troubleshooting section to quickly solve some common error cases.

To send logs from Azure to Datadog, follow this general process:

  1. Create an Azure Event Hub.
  2. Set up the Datadog-Azure function with an Event Hub trigger to forward logs to Datadog.
  3. Configure your Azure services to stream logs to the Event Hub by creating a diagnostic setting.

The instructions below walk through a basic, initial setup using the Azure Portal. All of these steps can be performed with the CLI, PowerShell, or resource templates by referring to the Azure documentation.

Azure Event Hub

Create an Azure Event Hub:

Create a new namespace or add a new Event Hub to an existing namespace by following the instructions below.

  1. In the Azure portal, navigate to the Event Hubs overview and click Create.
  2. Enter the name, pricing tier, subscription, and resource group.
  3. Select Location. Note: The Event Hub must be in the same Location as the resource you want to submit logs from. For activity logs or other account-wide log sources, you can choose any region.
  4. Select your desired options for throughput units, availability-zones, and auto-inflation.
  5. Click Create.

Add an Event Hub to your Event Hub namespace.

  1. In the Azure portal, navigate to a new or existing namespace.
  2. Click + Event Hub.
  3. Select your desired options for name, partition-count, and message-retention.
  4. Click Create.
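
As noted above, these portal steps can also be performed with the Azure CLI. A minimal sketch, with placeholder names and a location matching your log sources:

# Create the Event Hub namespace
az eventhubs namespace create --resource-group <RESOURCE_GROUP> --name <NAMESPACE_NAME> --location <LOCATION> --sku Standard

# Add an Event Hub to the namespace
az eventhubs eventhub create --resource-group <RESOURCE_GROUP> --namespace-name <NAMESPACE_NAME> --name datadog-eventhub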

Datadog Azure function

Set up the Datadog-Azure Function with an Event Hub trigger to forward logs to Datadog:

Create a new Function App or use an existing Function App and skip to the next section.

  1. In the Azure portal, navigate to the Function Apps overview and click Create.
  2. Select a subscription, resource group, region, and enter a name for your function app.
  3. Set Publish to Code, Runtime stack to Node.js, and Version to 16 LTS.
  4. Select an operating system and plan type.
  5. Click Next:Hosting.
  6. Select a storage account.
  7. Review and create the new function app.
  8. Wait for your deployment to finish.

Add a new function to your Function App using the Event Hub trigger template.

  1. Select a new/existing function app from the function apps list.
  2. Select Functions from the functions menu and click Create.
  3. Select Azure Event Hub trigger from the templates menu.
  4. Under Event Hub connection, select your namespace and Event Hub.
  5. Click Create.

Point your Event Hub trigger to Datadog.

  1. Select your new Event Hub trigger from the functions view.
  2. Click on Code + Test under the developer side menu.
  3. Add the Datadog-Azure Function code to your index.js file.
  4. Add your API key by creating a DD_API_KEY environment variable under the configuration tab of your function app, or copy it into the function code by replacing <DATADOG_API_KEY> on line 22.
  5. If you’re not using the Datadog US1 site, set your Datadog site with a DD_SITE environment variable under the configuration tab of your function app, or copy the site parameter into the function code on line 23.
  6. Save the function.
  7. Click on Integration then Azure Event Hubs under trigger and check the following settings:
    a. Event Parameter Name is set to eventHubMessages.
    b. Event Hub Cardinality is set to Many.
    c. Event Hub Data Type is left empty.
  8. Click Save.
  9. Verify your setup is correct by running the function and then checking the Datadog log explorer for the test message.
    Note: The test log event must be in valid JSON format.
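
As an alternative to steps 4 and 5 above, the same application settings can be applied with the Azure CLI. A sketch with placeholder names; DD_SITE is only needed if your organization is not on the Datadog US1 site:

az functionapp config appsettings set --name <FUNCTION_APP_NAME> --resource-group <RESOURCE_GROUP> --settings DD_API_KEY=<DATADOG_API_KEY> DD_SITE=<DATADOG_SITE>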

Activity logs

  1. In the Azure portal, navigate to the Activity Log.
  2. Click on Diagnostic Settings.
  3. Click Add diagnostic setting.
  4. Under category details, select the categories of logs you want to send to Datadog.
  5. Under destination details, select Stream to an event hub.
  6. Set the Event Hub namespace and name. These should match the Event Hub namespace and name that you used to create your Event Hub trigger.
  7. Set the shared access key. This key should be configured with send or manage access.
  8. Click Save.
  9. Verify your setup is correct by checking the Datadog log explorer for logs from this resource.
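
The same subscription-level diagnostic setting can be sketched with the Azure CLI. The names, IDs, and the single Administrative category below are placeholders, and flag names may differ slightly between CLI versions (check az monitor diagnostic-settings subscription create --help):

az monitor diagnostic-settings subscription create --name datadog-activity-logs --location <LOCATION> --event-hub-name <EVENT_HUB_NAME> --event-hub-auth-rule /subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.EventHub/namespaces/<NAMESPACE_NAME>/authorizationRules/RootManageSharedAccessKey --logs '[{"category": "Administrative", "enabled": true}]'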

Resource logs

Configure your Azure services to forward their logs to the Event Hub by creating a diagnostic setting.

  1. In the Azure portal, navigate to the resource of the logs you want to send to Datadog.
  2. Under the monitoring section of the resource blade, click Diagnostic settings.
  3. Click Add diagnostic setting.
  4. Under category details, select the categories of logs you want to send to Datadog.
  5. Under destination details, select Stream to an event hub.
  6. Set the Event Hub namespace and name. These should match the Event Hub namespace and name that you used to create your Event Hub trigger.
  7. Set the shared access key. This key should be configured with send or manage access.
  8. Click Save.
  9. Verify your setup is correct by checking the Datadog log explorer for logs from this resource.
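
The equivalent resource-level diagnostic setting can be sketched with the Azure CLI. The resource ID, log category, and Event Hub details are placeholders, and the available log categories vary by resource type:

az monitor diagnostic-settings create --name datadog-resource-logs --resource <RESOURCE_ID> --event-hub <EVENT_HUB_NAME> --event-hub-rule /subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.EventHub/namespaces/<NAMESPACE_NAME>/authorizationRules/RootManageSharedAccessKey --logs '[{"category": "<LOG_CATEGORY>", "enabled": true}]'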

To collect logs from all of your Azure App Services, follow this general process:

  1. Set up Azure Blob Storage from the Azure portal, Azure Storage Explorer, Azure CLI, or PowerShell.
  2. Set up the Datadog-Azure Function which forwards logs from your blob storage to Datadog.
  3. Configure your Azure App Services to forward their logs to the Blob Storage.

Create a new Azure Blob Storage function

If you are unfamiliar with Azure functions, see Create your first function in the Azure portal.

  1. In the Azure portal, navigate to the Function Apps overview and click Create.
  2. Select a subscription, resource group, region, and enter a name for your function app.
  3. Set Publish to Code, Runtime stack to Node.js, and Version to 16 LTS.
  4. Select Windows as the operating system and choose a plan type.
  5. Click Next:Hosting.
  6. Select a storage account.
  7. Review and Create the new function.
  8. Once deployment has finished, select your new function from the function apps list.
  9. Select to build your function In-portal and use the Blob Storage trigger template (under More templates…). If prompted, install the Microsoft.Azure.WebJobs.Extensions.EventHubs extension.
  10. Select or add your Storage account connection and click Create.
  11. Create an index.js file and add the Datadog-Azure Function code (replace <DATADOG_API_KEY> with your Datadog API Key).
  12. Save the function.
  13. Under Integrate, set the Blob Parameter Name to blobContent and click Save.
  14. Verify your setup is correct by checking the Datadog Log explorer for your logs.

Setup

Log Archiving

Archiving logs to Azure Blob Storage requires an App Registration even if you are using the Azure Native integration. To archive logs to Azure Blob Storage, follow the setup instructions to configure the integration using an App Registration. App Registrations created for archiving purposes do not need the Monitoring Reader role assigned.

Once you have an App Registration configured, you can create a log archive that writes to Azure Blob Storage.

Note: If your storage bucket is in a subscription being monitored through the Azure Native integration, a warning is displayed in the Azure Integration Tile about the App Registration being redundant. You can ignore this warning.

Prerequisites

Required permissions

To set up the Azure Native integration, you must be an Owner on any Azure subscriptions you want to link, and an Admin in the Datadog org you are linking them to. Ensure you have the appropriate access before starting the setup.

SSO configuration

(Optional): You can configure single sign-on (SSO) during the process of creating a new Datadog organization in Azure. You can also configure SSO later. To configure SSO during the initial creation, first create a Datadog enterprise gallery app.

Installation

Configuring the Azure integration requires the creation of a Datadog resource in Azure. These resources represent the connection or link between your Datadog organization and your Azure environment. You can configure a Datadog resource to link as many subscriptions as you wish to monitor. The same settings (such as host filters and log collection rules) are applied across all subscriptions linked. To apply different settings to different subscriptions, create different Datadog resources.

There are two options when you create a Datadog resource in Azure:

  1. Link to an existing Datadog organization. This is the more common action. Use this to configure your Datadog org to monitor an Azure subscription that hasn’t been linked yet. This action does not affect your Datadog billing plan.

  2. Create a new Datadog organization. This flow is less common. Use this if you do not yet have a Datadog org and you want to get started with a paid plan through Azure Marketplace. This flow creates a brand new Datadog org, allows you to select a billing plan, and links the associated Azure subscription for monitoring.

Note: Trials are not available through the Create a new Datadog organization option in Azure. To get started with a free trial, first create a trial Datadog org on our US3 site. Then use the linking flow to add any subscriptions you want to monitor.

Once you create a Datadog resource, data collection begins for the associated subscription. See details for using this resource to configure, manage, and deploy Datadog in our Guide.

Create Datadog resource

To start monitoring an Azure subscription, navigate to the Datadog Service page in Azure and select the option to create a new Datadog resource:

Azure US3 Datadog Service

Choose Link Azure subscription to an existing Datadog organization or Create a new Datadog organization. Linking is the more common action. Use this to configure monitoring for an Azure subscription that hasn’t been linked yet. Only choose the Create flow if you are not yet a Datadog customer and want to get started with a new, paid plan.

Azure US3 create a Datadog resource

Note: New Datadog organizations created through the Azure portal automatically have billing consolidated into their Azure invoice. This usage counts towards your organization’s MACC if applicable.

Configuration

After selecting to link to an existing Datadog organization, the portal displays a form for creating the Datadog resource:

Link Azure subscription to an existing Datadog organization

Provide the following values:

Property | Description
Subscription | The Azure subscription in which to create the Datadog resource. You must have owner access on this subscription. This subscription is monitored, and you can monitor additional subscriptions with this Datadog resource after creation.
Resource group | Create a new resource group or use an existing one. A resource group is a container that holds related resources for an Azure solution.
Resource name | Specify a name for the Datadog resource. The recommended naming convention is: subscription_name-datadog_org_name.
Location | The location is West US 2; this is the location where Datadog's US3 site is hosted in Azure. This has no impact on your use of Datadog. Like all Datadog sites, the US3 site is entirely SaaS and supports monitoring all Azure regions as well as other cloud providers and on-premises hosts.
Datadog organization | After the authentication step is completed, the Datadog organization name is set to the name of the Datadog organization being linked. The Datadog site is set to US3.

Click Link to Datadog organization to open a Datadog authentication window, then sign in to Datadog.

By default, Azure links your current Datadog organization to your Datadog resource. If you want to link to a different organization, select the appropriate organization in the authentication window:

Azure US3 select Datadog organization

When the OAuth flow is complete, verify that the Datadog organization name is correct.

After you complete the basic configuration, select Next: Metrics and logs.

Basics

After selecting to create a new Datadog organization, the portal displays a form for creating both the Datadog resource and the new Datadog organization:

Azure US3 create a Datadog resource

Provide the following values:

Property | Description
Subscription | The Azure subscription in which to create the Datadog resource. You must have owner access on this subscription. This subscription is monitored, and you can monitor additional subscriptions with this Datadog resource after creation.
Resource group | Create a new resource group or use an existing one. A resource group is a container that holds related resources for an Azure solution.
Resource name | The name for the Datadog resource. This name is assigned to the new Datadog organization.
Location | The location is West US 2; this is the location where Datadog's US3 site is hosted in Azure. This has no impact on your use of Datadog. Like all Datadog sites, the US3 site is entirely SaaS and supports monitoring all Azure regions as well as other cloud providers and on-premises hosts.
Datadog organization | The Datadog organization name is set to the resource name, and the Datadog site is set to US3.
Pricing plan | A list of the available Datadog pricing plans. If you have a private offer, it is available in this dropdown.
Billing term | Monthly.

Metrics and logs

Metric collection

By default, metrics for all Azure resources within the subscription are collected automatically. To send all metrics to Datadog, there is no action needed.

Tag rules for sending metrics

Optionally, limit metric collection for Azure VMs and App Service Plans using Azure tags attached to your resources.

  • Virtual machines, virtual machine scale sets, and App Service Plans with include tags send metrics to Datadog.
  • Virtual machines, virtual machine scale sets, and App Service Plans with exclude tags don’t send metrics to Datadog.
  • If there’s a conflict between inclusion and exclusion rules, exclusion takes priority.
  • There is no option to limit metric collection for other resources.
Log collection

There are three types of logs that can be emitted from Azure to Datadog.

Subscription level logs provide insight into the operations on your resources at the control plane. Updates on service health events are also included. Use the activity log to determine the what, who, and when for any write operations (PUT, POST, DELETE).

To send subscription level logs to Datadog, select Send subscription activity logs. If this option is left unchecked, none of the subscription level logs are sent to Datadog.

Azure resource logs provide insight into operations taken on Azure resources at the data plane. For example, getting a secret from a key vault or making a request to a database are data plane operations. The content of resource logs varies by the Azure service and resource type.

To send Azure resource logs to Datadog, select Send Azure resource logs for all defined resources. The types of Azure resource logs are listed in the Azure Monitor Resource Log categories. When this option is selected, all resource logs are sent to Datadog, including any new resources created in the subscription.

You can optionally filter the set of Azure resources sending logs to Datadog using Azure resource tags.

Tag rules for sending logs
  • Azure resources with include tags send logs to Datadog.
  • Azure resources with exclude tags don’t send logs to Datadog.
  • If there’s a conflict between inclusion and exclusion rules, exclusion takes priority.

For example, the screenshot below shows a tag rule where only those virtual machines, virtual machine scale sets, and app service plans tagged as Datadog = True send metrics and logs to Datadog.

Azure US3 create a Datadog resource logs
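
To bring a resource under such an include rule, apply the corresponding Azure tag to it, for example with the Azure CLI (the resource ID is a placeholder):

az tag update --resource-id <RESOURCE_ID> --operation Merge --tags Datadog=True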

Azure Active Directory (Azure AD) logs contain the history of sign-in activity and an audit trail of changes made in Azure AD for a particular tenant. To send these logs to Datadog, first complete the process to create a Datadog resource. Once you have a Datadog resource in Azure, follow the setup steps in the Datadog in the Azure Portal guide.

Once you have completed configuring metrics and logs, select Next: Security.

Security

Cloud Security Posture Management (CSPM) makes it easier to assess and visualize the current and historic security posture of your cloud environment, automate audit evidence collection, and catch misconfigurations that leave your organization vulnerable to attacks.

To enable CSPM, select Enable Datadog Cloud Security Posture Management. This enables Datadog CSPM for any subscriptions associated with the Datadog resource.

Once you have completed configuring CSPM, select Next: Single sign-on.

Single sign-on

(Optional) If you use Azure Active Directory as your identity provider, activate single sign-on from the Azure portal to Datadog.

If you’re linking the Datadog resource to an existing Datadog organization, you can’t set up single sign-on at this step. Instead, set up single sign-on after creating the Datadog resource. For more information, see Reconfigure single sign-on.

Azure US3 create a Datadog resource single sign-on

To establish single sign-on through Azure Active Directory, select the checkbox for Enable single sign-on through Azure Active Directory.

The Azure portal retrieves the appropriate Datadog application(s) from Azure Active Directory. Datadog Enterprise app(s) created prior to starting the Datadog resource creation process are available here.

Select the Datadog application you wish to use. If you haven’t created one, see the documentation on creating an Azure AD Enterprise Gallery app.

Azure US3 enable single sign-on

Select Next: Tags.

Tags

(Optional) Set up custom tags for your new Datadog resource. Provide name and value pairs for the tags to apply to the Datadog resource.

Azure US3 create a Datadog resource add tags

When you’ve finished adding tags, select Next: Review + create.

Review + create

Review your selections and the terms of use. After validation completes, select Create. Azure then deploys the Datadog resource. This resource links the subscription to your Datadog account, and enables a number of capabilities for ongoing management of the integration. For details and instructions, see our guide to managing Datadog in the Azure portal.

Azure US3 create a Datadog resource validation

After the deployment process is complete, select Go to Resource to see your Datadog resource.

Azure US3 Datadog deploy complete

Access Datadog

After your Datadog resource is created, access the associated Datadog organization. Access is dependent on whether you created a new organization or linked an existing organization.

SSO

If you created a new Datadog organization with SSO configured, use the link in the Datadog resource blade to log in. This is a SAML link that logs you in directly to your Datadog org from the Datadog resource in Azure.

Azure US3 access Datadog

No SSO

If you created a new Datadog organization without SSO configured, use the Datadog organization link in the overview blade to set your Datadog password. After your Datadog password is set, the link is a standard Datadog URL.

If you linked to an existing Datadog organization, there is no change to the way you access your Datadog organization.

Monitor the integration status

Once the integration is configured, Datadog begins running a continuous series of calls to Azure APIs to collect critical monitoring data from your Azure environment. Sometimes these calls return errors (for example, if the provided credentials have expired). These errors can inhibit or block Datadog’s ability to collect monitoring data.

When critical errors are encountered, the Azure integration generates events in the Datadog Events Explorer, and republishes them every five minutes. You can configure an Event Monitor to trigger when these events are detected and notify the appropriate team.

Datadog provides a recommended monitor you can use as a template to get started. To use the recommended monitor:

  1. In Datadog, go to Monitors -> New Monitor and select the Recommended Monitors tab.
  2. Select the recommended monitor titled [Azure] Integration Errors.
  3. Make any desired modifications to the search query or alert conditions. By default, the monitor triggers whenever a new error is detected, and resolves when the error has not been detected for the past 15 minutes.
  4. Update the notification and re-notification messages as desired. The events contain pertinent information about the error and are included in the notification automatically, including detailed information about the scope, the error response, and common steps to remediate.
  5. Configure notifications through your preferred channels (email, Slack, PagerDuty, or others) to make sure your team is alerted about issues affecting Azure data collection.

SAML SSO configuration

To use Security Assertion Markup Language (SAML) single sign-on (SSO) within the Datadog resource, you must set up an enterprise application.

To add an enterprise application, you need the role of global administrator, cloud application administrator, application administrator, or owner of the service principal.

Use the following steps to set up the enterprise application:

  1. Go to Azure portal and select Azure Active Directory.

  2. In the left pane, select Enterprise applications.

  3. Select New Application.

  4. In Add from the gallery, search for Datadog. Select the search result, then select Add.

    Add Datadog application from the gallery
  5. Once the app is created, go to Properties on the side panel. Set User assignment required? to No, and select Save.

    User assignment required - set to no
  6. Go to Single sign-on on the side panel, then select SAML.

    SSO - SAML
  7. Select Yes when prompted to save your single sign-on settings.

    Azure US3 Basic SAML Configuration
  8. The setup of single sign-on is complete.

Programmatic Management

Configuration for the Azure Native integration can be done in the Azure Portal or programmatically, for example with Terraform as described below.

If you have many subscriptions you want to monitor with the Azure Native integration, Datadog recommends using Terraform to create the Datadog resources. To learn about configuring Terraform across multiple subscriptions, see this blog post about Deploying to multiple Azure subscriptions using Terraform.
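
As one hedged example, Azure also publishes a datadog extension for the Azure CLI that exposes the Datadog resource type; assuming that extension, you could list the Datadog resources in your subscription like this:

# Install the Azure CLI "datadog" extension, then list Datadog resources in the current subscription
az extension add --name datadog
az datadog monitor list --output table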

Data Collected

Metrics

All standard Azure Monitor metrics plus unique Datadog generated metrics.

For a detailed list of metrics, select the appropriate Azure service in the overview section.

Events

The Azure integration automatically collects Azure Service Health events. To view these in Datadog, navigate to the Event explorer and filter for the Azure Service Health namespace.

Service Checks

The Azure integration does not include any service checks.

Tags

Azure integration metrics, events, and service checks receive the following tags:

Integration | Namespace | Datadog Tag Keys
All Azure integrations | All | cloud_provider, region, kind, type, name, resource_group, tenant_name, subscription_name, subscription_id, status (if applicable)
Azure VM integrations | azure.vm.* | host, size, operating_system, availability_zone
Azure App Service Plans | azure.web_serverfarms.* | per_site_scaling, plan_size, plan_tier, operating_system
Azure App Services Web Apps & Functions | azure.app_services.*, azure.functions.* | operating_system, server_farm_id, reserved, usage_state, fx_version (Linux web apps only), php_version, dot_net_framework_version, java_version, node_version, python_version
Azure SQL DB | azure.sql_servers_databases.* | license_type, max_size_mb, server_name, role, zone_redundant. For replication links only: state, primary_server_name, primary_server_region, secondary_server_name, secondary_server_region
Azure Load Balancer | azure.network_loadbalancers.* | sku_name
Azure Usage and Quota | azure.usage.* | usage_category, usage_name

Troubleshooting

See the Azure Troubleshooting guide.

Still need help? Contact Datadog support.

Further Reading