Microsoft Azure

Overview

Datadog’s Azure integration enables the collection of metrics and logs from your Azure environment. The configuration options are different depending on which Datadog site your organization is using:

All Sites: All Datadog sites can use the App Registration credential process for implementing metric collection, and the Event Hub setup for sending Azure Platform Logs.

US3: If you are using the Datadog US3 site, use workflows embedded natively in Azure to streamline collection of metrics and Azure Platform Logs. Datadog recommends using this method when possible. This entails creating a Datadog resource in Azure to link your Azure subscription(s) to your Datadog organization.

Connect to Microsoft Azure to:

  • Get metrics from Azure VMs with or without installing the Datadog Agent.
  • Collect standard Azure Monitor metrics for all Azure services: Application Gateway, App Service (Web & Mobile), Batch Service, Event Hub, IoT Hub, Logic App, Redis Cache, Server Farm (App Service Plan), SQL Database, SQL Elastic Pool, Virtual Machine Scale Set, and many more.
  • Tag your Azure metrics with Azure-specific information about the associated resource, such as region, resource group, and custom Azure tags.
  • Get Datadog-generated metrics that provide unique insights into your Azure environment.
  • Correlate data from your Azure applications across logs, metrics, APM tracing, user activity, and more within your Datadog organization.
Datadog's Azure integration is built to collect all metrics from Azure Monitor. Datadog strives to continually update the docs to show every sub-integration, but cloud services rapidly release new metrics and services so the list of integrations can sometimes lag.
The azure.*.status and azure.*.count metrics are generated by Datadog from Azure Resource Health. For more information, see Azure Status and Count Metrics.
Integration | Description
Analysis Services | A service that provides data models in the cloud.
API Management | A service to publish, secure, transform, maintain, and monitor APIs.
App Service | A service for deploying and scaling web, mobile, API, and business logic applications.
App Service Environment | A service that provides an environment for securely running App Service apps at high scale.
App Service Plan | A set of compute resources for a web app to run on.
Application Gateway | A web traffic load balancer that enables you to manage traffic to your web applications.
Automation | A service that provides automation and configuration management across your environments.
Batch Service | A managed task scheduler and processor.
Cognitive Services | APIs, SDKs, and services available to help build applications without AI or data science knowledge.
Container Instances | A service to deploy containers without the need to provision or manage the underlying infrastructure.
Container Service | A production-ready Kubernetes, DC/OS, or Docker Swarm cluster.
Cosmos DB | A database service that supports document, key-value, wide-column, and graph databases.
Customer Insights | Enables organizations to bring together data sets to build a 360° view of their customers.
Data Explorer | A fast and highly scalable data exploration service.
Data Factory | A service to compose data storage, movement, and processing services into automated data pipelines.
Data Lake Analytics | An analytics job service that simplifies big data.
Data Lake Store | A no-limits data lake that powers big data analytics.
Database for MariaDB | A service that provides a fully managed, enterprise-ready community MariaDB database.
Event Grid | An event routing service that allows for uniform event consumption using a publish-subscribe model.
Event Hub | A large-scale managed data streaming service.
ExpressRoute | A service to extend your on-premises networks into the cloud.
Firewall | Cloud-native network security to protect your Azure Virtual Network resources.
Functions | A service for running serverless code in response to event triggers.
HDInsights | A cloud service that processes massive amounts of data.
IoT Hub | Connect, monitor, and manage billions of IoT assets.
Key Vault | A service to safeguard and manage cryptographic keys and secrets used by cloud applications and services.
Load Balancer | Scale your applications and create high availability for your services.
Logic App | Build powerful integration solutions.
Machine Learning | An enterprise-grade machine learning service to build and deploy models faster.
Network Interfaces | Enables VM communication with internet, Azure, and on-premises resources.
Notification Hubs | A push engine that allows you to send notifications to any platform from any backend.
Public IP Address | A resource that enables inbound communication and outbound connectivity from the internet.
Redis Cache | A managed data cache.
Relay | Securely expose services that run in your corporate network to the public cloud.
Cognitive Search | A search-as-a-service cloud solution that provides tools for adding a rich search experience.
Storage | Storage for blobs, files, queues, and tables.
Stream Analytics | An event-processing engine to examine high volumes of data streaming from devices.
SQL Database | A highly scalable relational database in the cloud.
SQL Database Elastic Pool | Manage the performance of multiple databases.
Usage and Quotas | Track your Azure usage.
Virtual Machine | A virtual machine management service.
Virtual Machine Scale Set | Deploy, manage, and autoscale a set of identical VMs.
Virtual Network | Allow Azure resources to securely communicate with each other, the internet, and on-premises networks.

Setup

Installation

Integrate your Microsoft Azure account with Datadog using the Azure CLI tool or the Azure portal. This integration method works automatically for all Azure clouds: Public, China, Germany, and Government.

Follow the instructions below, and Datadog automatically detects which cloud you are using to complete the integration.

Integrating through the Azure CLI

To integrate Datadog with Azure using the Azure CLI, you must have Azure CLI installed.

First, log in to the Azure account you want to integrate with Datadog:

az login

Run the account show command:

az account show

Enter the generated Tenant ID value in the Datadog Azure Integration tile under Tenant name/ID.

Create an application as a service principal using the format:

az ad sp create-for-rbac --role "Monitoring Reader" --scopes /subscriptions/{subscription_id}
  • This command grants the Service Principal the monitoring reader role for the subscription you would like to monitor.
  • The appID generated from this command must be entered in the Datadog Azure Integration tile under Client ID.
  • Add --name <CUSTOM_NAME> to use a custom name; otherwise, Azure generates a unique one. The name is not used in the setup process.
  • Add --password <CUSTOM_PASSWORD> to use a custom password; otherwise, Azure generates a unique one. This password must be entered in the Datadog Azure Integration tile under Client Secret.

For the classic Azure CLI (the azure command), first log in to the Azure account you want to integrate with Datadog:

azure login

Run the account show command:

azure account show

Enter the generated Tenant ID value in the Datadog Azure Integration tile under Tenant name/ID.

Create a name and password:

azure ad sp create -n <NAME> -p <PASSWORD>
  • The <NAME> is NOT used but is required as part of the setup process.
  • The <PASSWORD> you choose must be entered in the Datadog Azure Integration tile under Client Secret.
  • The Object Id returned from this command is used in place of <OBJECT_ID> in the next command.

Assign the Monitoring Reader role to the Service Principal using the format:

azure role assignment create --objectId <OBJECT_ID> -o "Monitoring Reader" -c /subscriptions/<SUBSCRIPTION_ID>/
  • This command grants the Service Principal the monitoring reader role for the subscription you would like to monitor.
  • The Service Principal Name generated from this command must be entered in the Datadog Azure Integration tile under Client ID.
  • <SUBSCRIPTION_ID> is the Azure subscription you would like to monitor, and is listed as ID with azure account show or in the portal.

For older versions of the classic Azure CLI that require creating an Active Directory application first, log in to the Azure account you want to integrate with Datadog:

azure login

Run the account show command:

azure account show

Enter the generated Tenant ID value in the Datadog Azure Integration tile under Tenant name/ID.

Create a name, home-page, identifier-uris, and password:

azure ad app create --name "<NAME>" --home-page "<URL>" --identifier-uris "<URL>" --password "<PASSWORD>"
  • The name, home-page, and identifier-uris are NOT used but are required as part of the setup process.
  • The password you choose must be entered in the Datadog Azure Integration tile under Client Secret.
  • The AppId returned from this command is used in the next command and must be entered in the Datadog Azure Integration tile under Client ID.

Create a Service Principal using:

For Azure CLI versions earlier than 0.10.2:

azure ad sp create {app-id}

For Azure CLI versions 0.10.2 and later:

azure ad sp create -a {app-id}
  • The Object Id returned from this command is used in place of <OBJECT_ID> in the next command.

Create the role assignment using the format:

azure role assignment create --objectId <OBJECT_ID> --roleName "Monitoring Reader" --subscription <SUBSCRIPTION_ID>
  • This command grants the Service Principal the monitoring reader role for the subscription you would like to monitor.
  • <SUBSCRIPTION_ID> is the Azure subscription you would like to monitor, and is listed as ID with azure account show or in the portal.

Integrating through the Azure portal

  1. Create an app registration in your Active Directory and pass the correct credentials to Datadog.
  2. Give the application read-access to any subscriptions you would like to monitor.
Creating the app registration
  1. Under Azure Active Directory, navigate to App Registrations and click New registration.

  2. Enter the following and click the Create button. The name and sign-on URL are not used but are required for the setup process.

    • Name: Datadog Auth
    • Supported Account Types: Accounts in this organizational directory only (Datadog)
    • Redirect URI:
Giving read permissions to the application
  1. Navigate to Subscriptions through the search box or the left sidebar.
  2. Click on the subscription you would like to monitor.
  3. Select Access control (IAM) in the subscription menu and click Add > Add role assignment.
  4. For Role, select Monitoring Reader. Under Select, choose the name of the Application you just created.
  5. Click Save.
  6. Repeat this process for any additional subscriptions you want to monitor with Datadog. Note: Users of Azure Lighthouse can add subscriptions from customer tenants.

Note: Diagnostics must be enabled for ARM deployed VMs to collect metrics, see Enable diagnostics.

Completing the integration
  1. Under App Registrations, select the App you created, copy the Application ID and Tenant ID, and paste the values in the Datadog Azure Integration tile under Client ID and Tenant ID.
  2. For the same app, go to Manage > Certificates and secrets.
  3. Add a new Client Secret called datadogClientSecret, select a timeframe for Expires, and click Add.
  4. When the key value is shown, copy and paste the value in the Datadog Azure Integration tile under Client Secret and click Install Integration or Update Configuration.

Note: Your updates to the Azure configuration can take up to 20 minutes to be reflected in Datadog.

Configuration

Optionally, you can limit the Azure VMs that are pulled into Datadog by entering tags under Optionally filter to VMs with tag.

This list of tags in <KEY>:<VALUE> form is separated by commas and defines a filter used while collecting metrics. Wildcards such as ? (for single characters) and * (for multiple characters) can also be used.

Only VMs that match one of the defined tags are imported into Datadog. The rest are ignored. VMs matching a given tag can also be excluded by adding ! before the tag. For example:

datadog:monitored,env:production,!env:staging,instance-type:c1.*
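
As an illustration of these matching rules, the following Python sketch evaluates a filter string against a VM's tags. This is not Datadog's implementation; in particular, the assumption that an exclusion tag overrides a matching inclusion tag is illustrative.

```python
from fnmatch import fnmatch

def vm_matches_filter(vm_tags, filter_spec):
    """Return True if a VM should be imported given a comma-separated
    tag filter such as "datadog:monitored,!env:staging,instance-type:c1.*".

    Illustrative sketch only. A VM is imported if it matches at least
    one include tag and no exclude (!-prefixed) tag.
    """
    includes, excludes = [], []
    for entry in filter_spec.split(","):
        entry = entry.strip()
        (excludes if entry.startswith("!") else includes).append(entry.lstrip("!"))

    def matches(pattern):
        # Tags are <KEY>:<VALUE> strings; ? and * wildcards are allowed.
        return any(fnmatch(tag, pattern) for tag in vm_tags)

    if any(matches(p) for p in excludes):
        return False  # assumed: exclusion overrides inclusion
    return any(matches(p) for p in includes)

FILTER = "datadog:monitored,env:production,!env:staging,instance-type:c1.*"

# A production VM tagged for monitoring is imported:
print(vm_matches_filter(["datadog:monitored", "env:production"], FILTER))  # True

# A staging VM is excluded even though it also matches an include tag:
print(vm_matches_filter(["datadog:monitored", "env:staging"], FILTER))  # False
```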

Metrics collection

After the integration tile is set up, metrics are collected by a crawler. To collect additional metrics, deploy the Datadog Agent to your VMs:

Agent installation

  1. In the Azure portal, navigate to your VM > Settings > Extensions > Add and select Datadog Agent.
  2. Click Create, enter your Datadog API key, and click OK.

To install the Agent based on operating system or CI/CD tool, see the Datadog Agent install instructions.

Note: Domain controllers are not supported when installing the Datadog Agent with the Azure extension.

Validation

It may take a few minutes for metrics from applications under the new subscription to appear.

Navigate to the Azure VM Default Dashboard to see this dashboard populate with your infrastructure’s data:

Log collection

The best method for submitting logs from Azure to Datadog is with the Agent or DaemonSet. For some resources, this is not possible. In these cases, Datadog recommends creating a log forwarding pipeline using an Azure Event Hub to collect Azure Platform Logs. For resources that cannot stream Azure Platform Logs to an Event Hub, you can use the Blob Storage forwarding option.

Datadog provides two automated scripts you can use.

The first script creates and configures the Azure resources required to get activity logs streaming into your Datadog account. These resources include the Activity Log diagnostic setting, an Azure Function, an Event Hub namespace, and an Event Hub.

The second script is a more generic option that deploys only the Event Hub and Azure Function portions, without any diagnostic settings. You can then configure your own streaming sources against it. In either case, the Event Hubs can be used by other streaming sources.

Example:

If you want to stream both activity logs and resource logs from westus, run the first script with the optional parameter -ResourceGroupLocation westus (activity logs are a subscription-level source, so you can create your pipeline for them in any region). Once this is deployed, you can send resource logs through the same Event Hub by adding diagnostic settings on your resources in westus.

Note: This integration does not collect events.

Sending activity logs from Azure to Datadog

Step 1: In the Azure portal, navigate to your Cloud Shell.

Step 2: Run the command below to download the automation script into your Cloud Shell environment.

(New-Object System.Net.WebClient).DownloadFile("https://raw.githubusercontent.com/DataDog/datadog-serverless-functions/master/azure/eventhub_log_forwarder/activity_logs_deploy.ps1", "activity_logs_deploy.ps1")

You can also view the contents of the script.

Step 3: Invoke the script by running the command below, replacing <api_key> with your Datadog API key and <subscription_id> with your Azure subscription ID. You can also add other optional parameters to configure your deployment. See Optional Parameters.

./activity_logs_deploy.ps1 -ApiKey <api_key> -SubscriptionId <subscription_id> 

Sending Azure Platform logs to Datadog

For a generic solution for sending Azure Platform Logs (including resource logs), you can also deploy just the Event Hub and log forwarder. After you deploy this pipeline, you can create diagnostic settings for each of the log sources, configuring them to stream to Datadog.

Step 1: In the Azure portal, navigate to your Cloud Shell.

Step 2: Run the command below to download the automation script into your Cloud Shell environment.

(New-Object System.Net.WebClient).DownloadFile("https://raw.githubusercontent.com/DataDog/datadog-serverless-functions/master/azure/eventhub_log_forwarder/resource_deploy.ps1", "resource_deploy.ps1")

You can also view the contents of the script.

Step 3: Invoke the script by running the command below, replacing <api_key> with your Datadog API key and <subscription_id> with your Azure subscription ID. You can also add other optional parameters to configure your deployment. See Optional Parameters.

./resource_deploy.ps1 -ApiKey <api_key> -SubscriptionId <subscription_id> 

Step 4: Create diagnostic settings for all Azure resources that should send logs to Datadog. Configure these diagnostic settings to start streaming to the Event Hub you just created.

Note: Resources can only stream to Event Hubs in the same Azure region, so you need to replicate step 2 for each region you want to stream resource logs from.

Note: All of the Azure resources deployed for the Platform Logs pipeline have the resource group location appended to their default names, for example: datadog-eventhub-westus. You can alter this convention by overriding the corresponding parameter.

Optional parameters

Note: Ensure that your custom resource names are unique when you customize the parameters. Validate that the resource name does not already exist within your list of other Azure resources.

-Flag <Default Parameter> | Description
-DatadogSite <datadoghq.com> | Customize your Datadog instance by adding this flag with another Datadog site as a parameter.
-Environment <AzureCloud> | Manage storage in Azure independent clouds by adding this flag as a parameter. Additional options are AzureChinaCloud, AzureGermanCloud, and AzureUSGovernment.
-ResourceGroupLocation <westus2> | Choose the region in which your Azure resource group and resources are deployed by adding this flag with an updated Azure region.
-ResourceGroupName <datadog-log-forwarder-rg> | Customize the name of your Azure resource group by adding this flag with an updated parameter.
-EventhubNamespace <datadog-ns-4c6c53b4-1abd-4798-987a-c8e671a5c25e> | Customize your Azure Event Hub namespace by adding this flag with an updated parameter. By default, datadog-ns-<globally-unique-ID> is generated.
-EventhubName <datadog-eventhub> | Customize the name of your Azure Event Hub by adding this flag with an updated parameter.
-FunctionAppName <datadog-functionapp-1435ad2f-7c1f-470c-a4df-bc7289d8b249> | Customize the name of your Azure function app by adding this flag with an updated parameter. By default, datadog-functionapp-<globally-unique-ID> is generated.
-FunctionName <datadog-function> | Customize the name of your Azure Function by adding this flag with an updated parameter.
-DiagnosticSettingName <datadog-activity-logs-diagnostic-setting> | Customize the name of your Azure diagnostic setting by adding this flag with an updated parameter. (Only relevant for sending activity logs.)

Installation errors? See the troubleshooting section to quickly solve some common error cases.

To send logs from Azure to Datadog, follow this general process:

  1. Create an Azure Event Hub.
  2. Set up the Datadog-Azure function with an Event Hub trigger to forward logs to Datadog.
  3. Configure your Azure services to stream logs to the Event Hub by creating a diagnostic setting.

The instructions below walk through a basic, initial setup using the Azure portal. All of these steps can be performed with the CLI, PowerShell, or resource templates by referring to the Azure documentation.

Azure Event Hub

Create an Azure Event Hub:

Create a new namespace or add a new Event Hub to an existing namespace by following the instructions below.

  1. In the Azure portal, navigate to the Event Hubs overview and click Add.
  2. Enter the name, pricing tier, subscription, and resource group.
  3. Select Location. Note: The Event Hub must be in the same Location as the resource you want to submit logs from. For activity logs or other account-wide log sources, you can choose any region.
  4. Select your desired options for throughput units, availability-zones, and auto-inflation.
  5. Click Create.

Add an Event Hub to your Event Hub namespace.

  1. In the Azure portal, navigate to a new or existing namespace.
  2. Click + Event Hub.
  3. Select your desired options for name, partition-count, and message-retention.
  4. Click Create.

Datadog Azure function

Set up the Datadog-Azure Function with an Event Hub trigger to forward logs to Datadog:

Create a new function app or use an existing function app and skip to the next section.

  1. In the Azure portal, navigate to your Function Apps > Functions and click Add.
  2. Select a subscription, resource group, region, and enter a name for your function.
  3. Set Publish to Code, Runtime stack to Node.js, and Version to 12 LTS.
  4. Click Next:Hosting.
  5. Select a storage account, operating system, and plan type.
  6. Review and create the new function app.
  7. Wait for your deployment to finish.

Add a new function to your function app using the Event Hub trigger template.

  1. Select a new/existing function app from the function apps list.
  2. Select Functions from the functions menu and click Add.
  3. Select Azure Event Hub trigger from the templates menu and click New.
  4. Select your namespace and Event Hub for Event Hub connection and click OK.
  5. Click Create Function.

Point your Event Hub trigger to Datadog.

  1. Select your new Event Hub trigger from the functions view.
  2. Click on Code + Test under the developer side menu.
  3. Add the Datadog-Azure Function code to your index.js file.
  4. Add your API key by creating a DD_API_KEY environment variable under the configuration tab of your function app, or copy it into the function code by replacing <DATADOG_API_KEY> on line 22.
  5. Save the function.
  6. Click on Integration, then Azure Event Hubs under trigger, and check the following settings:
    a. Event Parameter Name is set to eventHubMessages.
    b. Event Hub Cardinality is set to Many.
    c. Event Hub Data Type is left empty.
  7. Click Save.
  8. Verify your setup is correct by running the function and then checking the Datadog log explorer for the test message.
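
To make the forwarding step concrete, here is a hedged Python sketch of what such a function does. The actual forwarder is the Node.js code referenced above; the envelope handling and `build_payload` helper are illustrative assumptions, while the `DD-API-KEY` header and v2 logs intake endpoint come from Datadog's HTTP API.

```python
import json
import urllib.request

# Datadog's public HTTP log intake endpoint (v2).
DD_LOGS_ENDPOINT = "https://http-intake.logs.datadoghq.com/api/v2/logs"

def build_payload(event_hub_messages, source="azure"):
    """Flatten a batch of Event Hub messages into Datadog log entries.

    Azure diagnostic settings wrap log lines in a {"records": [...]}
    envelope; anything else is forwarded as-is. Illustrative sketch,
    not the actual Datadog-Azure Function code."""
    logs = []
    for message in event_hub_messages:
        if isinstance(message, dict) and "records" in message:
            records = message["records"]
        else:
            records = [message]
        for record in records:
            logs.append({"ddsource": source, "message": json.dumps(record)})
    return logs

def forward(event_hub_messages, api_key):
    """POST one batch to Datadog; shown only for the shape of the request."""
    body = json.dumps(build_payload(event_hub_messages)).encode("utf-8")
    request = urllib.request.Request(
        DD_LOGS_ENDPOINT,
        data=body,
        method="POST",
        headers={"Content-Type": "application/json", "DD-API-KEY": api_key},
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Two Azure records inside one Event Hub envelope become two Datadog logs:
payload = build_payload([{"records": [
    {"operationName": "Microsoft.Web/sites/write"},
    {"operationName": "Microsoft.Sql/servers/databases/read"},
]}])
print(len(payload))  # 2
```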

Activity logs

  1. In the Azure portal, navigate to the Activity Log.
  2. Click on Diagnostic Settings.
  3. Click Add diagnostic setting.
  4. Under category details, select the categories of logs you want to send to Datadog.
  5. Under destination details, select Stream to an event hub.
  6. Set the Event Hub namespace and name. These should match the Event Hub namespace and name that you used to create your Event Hub trigger.
  7. Set the shared access key. This key should be configured with send or manage access.
  8. Click Save.
  9. Verify your setup is correct by checking the Datadog log explorer for logs from this resource.

Resource logs

Configure your Azure services to forward their logs to the Event Hub by creating a diagnostic setting.

  1. In the Azure portal, navigate to the resource of the logs you want to send to Datadog.
  2. Under the monitoring section of the resource blade, click Diagnostic settings.
  3. Click Add diagnostic setting.
  4. Under category details, select the categories of logs you want to send to Datadog.
  5. Under destination details, select Stream to an event hub.
  6. Set the Event Hub namespace and name. These should match the Event Hub namespace and name that you used to create your Event Hub trigger.
  7. Set the shared access key. This key should be configured with send or manage access.
  8. Click Save.
  9. Verify your setup is correct by checking the Datadog log explorer for logs from this resource.

To collect logs from all of your Azure App Services, follow this general process:

  1. Set up Azure Blob Storage from the Azure portal, Azure Storage Explorer, Azure CLI, or PowerShell.
  2. Set up the Datadog-Azure Function which forwards logs from your blob storage to Datadog.
  3. Configure your Azure App Services to forward their logs to the Blob Storage.

Create a new Azure Blob Storage function

If you are unfamiliar with Azure functions, see Create your first function in the Azure portal.

  1. In the Azure portal, navigate to your Function Apps > Functions and click Add.
  2. Select a subscription, resource group, region, and enter a name for your function.
  3. Set Publish to Code and Runtime stack to Node.js.
  4. Click Next:Hosting.
  5. Select a storage account and plan type, then select Operating System Windows.
  6. Review and Create the new function.
  7. Once deployment has finished, select your new function from the Function Apps list.
  8. Select to build your function In-portal and use the Blob Storage trigger template (under More templates…). If prompted, install the Microsoft.Azure.WebJobs.Extensions.Storage extension.
  9. Select or add your Storage account connection and click Create.
  10. Create an index.js file and add the Datadog-Azure Function code (replace <DATADOG_API_KEY> with your Datadog API Key).
  11. Save the function.
  12. Under Integrate, set the Blob Parameter Name to blobContent and click Save.
  13. Verify your setup is correct by checking the Datadog Log explorer for your logs.

Prerequisites

Required permissions

To set up the Azure Datadog integration, you must have Owner access on the Azure subscription. Ensure you have the appropriate access before starting the setup.

SSO configuration

(Optional): You can configure single sign-on (SSO) during the process of creating a new Datadog organization in Azure. You can also configure SSO later. To configure SSO during the initial creation, first create a Datadog enterprise gallery app.

Installation

Configuring the Datadog Azure integration requires the creation of a Datadog resource in Azure. A Datadog resource in Azure represents the connection between a Datadog organization and an Azure subscription. Creating a Datadog resource happens in one of two scenarios:

  1. You are linking an Azure subscription to an existing Datadog organization.
  2. You are creating a Datadog organization through the Azure Marketplace, and simultaneously linking an Azure subscription to this Datadog organization.

A Datadog resource allows you to take the following actions within its associated Azure subscription:

  • Configure the collection of Azure metrics and platform logs
  • Deploy the Datadog VM Agent onto your Azure VMs
  • Deploy the Datadog .NET extension onto your Azure Web Apps
  • Configure single sign-on (SSO)
  • Verify the Azure resources sending metrics and logs
  • View details about the Datadog Agent status and configuration on your Azure VMs

Create Datadog resource

To create a new Datadog resource in Azure, navigate to the Datadog service page in Azure and select the option to create a new Datadog resource:

Choose “Create a new Datadog organization” or “Link Azure subscription to an existing Datadog organization”:

Note: New Datadog organizations created through the Azure portal automatically have billing consolidated into their Azure invoice. This usage counts towards your organization’s MACC if applicable.

Configuration

Basics

After selecting to create a new Datadog organization, the portal displays a form for creating both the Datadog resource and the new Datadog organization:

Provide the following values:

Property | Description
Subscription | The Azure subscription you want to monitor with Datadog. The Datadog resource exists in this subscription. You must have owner access.
Resource group | Create a new resource group or use an existing one. A resource group is a container that holds related resources for an Azure solution.
Resource name | The name for the Datadog resource. This name is assigned to the new Datadog organization.
Location | The default location is West US 2. This has no impact on your use of Datadog; resources in all Azure regions, as well as other cloud providers and on-premises hosts, can be monitored.
Datadog organization | The Datadog organization name is set to the resource name, and the Datadog site is set to US3.
Pricing plan | A list of the available Datadog pricing plans. If you have a private offer, it is available in this dropdown.
Billing term | Monthly.

After selecting to link to an existing Datadog organization, the portal displays a form for creating the Datadog resource:

Provide the following values:

Property | Description
Subscription | The Azure subscription you want to monitor with Datadog. The Datadog resource exists in this subscription. You must have owner access.
Resource group | Create a new resource group or use an existing one. A resource group is a container that holds related resources for an Azure solution.
Resource name | Specify a name for the Datadog resource. The recommended naming convention is: subscription_name-datadog_org_name.
Location | The default location is West US 2, the Azure region where Datadog's US3 site is hosted. This has no impact on your use of Datadog; resources in all Azure regions, as well as other cloud providers and on-premises hosts, can be monitored.
Datadog organization | The Datadog organization name is set to the resource name, and the Datadog site is set to US3.

Click Link to Datadog organization to open a Datadog authentication window, then sign in to Datadog.

By default, Azure links your current Datadog organization to your Datadog resource. If you want to link to a different organization, select the appropriate organization in the authentication window:

When the OAuth flow is complete, verify the Datadog organization name is correct.

After you complete the basic configuration, select Next: Metrics and logs.

Metrics and logs

Metric collection

By default, metrics for all Azure resources within the subscription are collected automatically. To send all metrics to Datadog, there is no action needed.

Tag rules for sending metrics

Optionally, limit metric collection for Azure VMs and App Service Plans using Azure tags attached to your resources.

  • Virtual machines, virtual machine scale sets, and App Service Plans with include tags send metrics to Datadog.
  • Virtual machines, virtual machine scale sets, and App Service Plans with exclude tags don’t send metrics to Datadog.
  • If there’s a conflict between inclusion and exclusion rules, exclusion takes priority.
  • There is no option to limit metric collection for other resources.
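
A small Python sketch (illustrative only, not Datadog's implementation; exact tag matching is assumed, without wildcards) makes the precedence of these rules explicit:

```python
def sends_metrics(resource_tags, include, exclude):
    """Decide whether a VM, VM scale set, or App Service Plan sends
    metrics under the tag rules above. Illustrative sketch: exclusion
    wins on conflict, and with no include rules everything is collected."""
    tags = set(resource_tags)
    if tags & set(exclude):
        return False  # exclude rules take priority over include rules
    if not include:
        return True   # no include rules: metrics are collected by default
    return bool(tags & set(include))

print(sends_metrics(["Datadog:True"], include=["Datadog:True"], exclude=[]))      # True
print(sends_metrics(["Datadog:True", "env:dev"], ["Datadog:True"], ["env:dev"]))  # False
print(sends_metrics(["env:prod"], include=[], exclude=["env:dev"]))               # True
```
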
Log collection

There are two types of logs that can be emitted from Azure to Datadog.

Subscription level logs provide insight into the operations on your resources at the control plane. Updates on service health events are also included. Use the activity log to determine the what, who, and when for any write operations (PUT, POST, DELETE).

To send subscription level logs to Datadog, select “Send subscription activity logs”. If this option is left unchecked, none of the subscription level logs are sent to Datadog.

Azure resource logs provide insight into operations taken on Azure resources at the data plane. For example, getting a secret from a key vault or making a request to a database are data plane operations. The content of resource logs varies by the Azure service and resource type.

To send Azure resource logs to Datadog, select “Send Azure resource logs for all defined resources”. The types of Azure resource logs are listed in the Azure Monitor Resource Log categories. When this option is selected, all resource logs are sent to Datadog, including any new resources created in the subscription.

You can optionally filter the set of Azure resources sending logs to Datadog using Azure resource tags.

Tag rules for sending logs
  • Azure resources with include tags send logs to Datadog.
  • Azure resources with exclude tags don’t send logs to Datadog.
  • If there’s a conflict between inclusion and exclusion rules, exclusion takes priority.

For example, with a tag rule of Datadog = True, only those virtual machines, virtual machine scale sets, and App Service Plans tagged Datadog = True send metrics and logs to Datadog.

Once you have completed configuring metrics and logs, select Next: Single sign-on.

Single sign-on

(Optional) If you use Azure Active Directory as your identity provider, activate single sign-on from the Azure portal to Datadog.

If you’re linking the Datadog resource to an existing Datadog organization, you can’t set up single sign-on at this step. Instead, set up single sign-on after creating the Datadog resource. For more information, see Reconfigure single sign-on.

To establish single sign-on through Azure Active Directory, select the checkbox for “Enable single sign-on through Azure Active Directory”.

The Azure portal retrieves the appropriate Datadog application(s) from Azure Active Directory. Datadog Enterprise app(s) created prior to starting the Datadog resource creation process are available here.

Select the Datadog application you wish to use. If you haven’t created one, see the documentation on creating an Azure AD Enterprise Gallery app.

Select Next: Tags.

Tags

(Optional) Set up custom tags for your new Datadog resource. Provide name and value pairs for the tags to apply to the Datadog resource.

When you’ve finished adding tags, select Next: Review + create.

Review + create

Review your selections and the terms of use. After validation completes, select “Create”. Azure then deploys the Datadog resource.

After the deployment process is complete, select “Go to Resource” to see your Datadog resource.

Access Datadog

After your Datadog resource is created, access the associated Datadog organization. Access depends on whether you created a new organization or linked an existing one.

SSO

If you created a new Datadog organization with SSO configured, use the link in the Datadog resource blade to log in. This is a SAML link that logs you in directly to your Datadog org from the Datadog resource in Azure.

No SSO

If you created a new Datadog organization without SSO configured, use the Datadog organization link in the overview blade to set your Datadog password. After your Datadog password is set, the link is a standard Datadog URL.

If you linked to an existing Datadog organization, there is no change to the way you access your Datadog organization.

SAML SSO configuration

To use Security Assertion Markup Language (SAML) single sign-on (SSO) within the Datadog resource, you must set up an enterprise application.

To add an enterprise application, you need the role of global administrator, cloud application administrator, application administrator, or owner of the service principal.

Use the following steps to set up the enterprise application:

  1. Go to Azure portal and select “Azure Active Directory”.
  2. In the left pane, select “Enterprise applications”.
  3. Select “New Application”.
  4. In “Add from the gallery”, search for Datadog. Select the search result, then select “Add”.
  5. Once the app is created, go to “Properties” on the side panel. Set “User assignment required?” to No, and select “Save”.
  6. Go to “Single sign-on” on the side panel, then select SAML.
  7. Select “Yes” when prompted to save your single sign-on settings.
  8. The setup of single sign-on is complete.

Data Collected

Metrics

All standard Azure Monitor metrics plus unique Datadog generated metrics.

For a detailed list of metrics, select the appropriate Azure service in the overview section.

Events

The Azure integration does not collect general Azure events; that activity data is instead collected as activity logs. The exception is Azure Service Health: the integration automatically collects Azure Service Health events. To view these in Datadog, navigate to the Event explorer and filter for the “Azure Service Health” namespace.

Service Checks

The Azure integration does not include any service checks.

Tags

Azure integration metrics, events, and service checks receive the following tags:

| Integration | Namespace | Datadog Tag Keys |
|---|---|---|
| All Azure integrations | All | cloud_provider, region, kind, type, name, resource_group, tenant_name, subscription_name, subscription_id, status (if applicable) |
| Azure VM integrations | azure.vm.* | host, size, operating_system, availability_zone |
| Azure App Service Plans | azure.web_serverfarms.* | per_site_scaling, plan_size, plan_tier, operating_system |
| Azure App Services Web Apps & Functions | azure.app_services.*, azure.functions.* | operating_system, server_farm_id, reserved, usage_state, fx_version (Linux web apps only), php_version, dot_net_framework_version, java_version, node_version, python_version |
| Azure SQL DB | azure.sql_servers_databases.* | license_type, max_size_mb, server_name, role, zone_redundant. For replication links only: state, primary_server_name, primary_server_region, secondary_server_name, secondary_server_region |
| Azure Load Balancer | azure.network_loadbalancers.* | sku_name |
| Azure Usage and Quota | azure.usage.* | usage_category, usage_name |
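These tags can be used to scope and group Azure metrics in Datadog queries. For example, the following query averages VM CPU by resource name within one resource group (the `azure.vm.percentage_cpu` metric and the `my-rg` resource group name are illustrative):

```
avg:azure.vm.percentage_cpu{resource_group:my-rg} by {name}
```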

Troubleshooting

See the Azure Troubleshooting guide.

Still need help? Contact Datadog support.

Further Reading