
Google Cloud Platform


Overview

Connect to Google Cloud Platform to see all your Google Compute Engine (GCE) hosts in Datadog. You can see your hosts in the infrastructure overview in Datadog and sort through them, since Datadog automatically tags them with GCE host tags and any GCE labels you may have added.

Related integrations include:

Integration | Description
App Engine | PaaS (platform as a service) to build scalable applications
Big Query | Enterprise data warehouse
Bigtable | NoSQL Big Data database service
CloudSQL | MySQL database service
Cloud APIs | Programmatic interfaces for all Google Cloud Platform services
Cloud Composer | A fully managed workflow orchestration service
Cloud Dataproc | A cloud service for running Apache Spark and Apache Hadoop clusters
Cloud Filestore | High-performance, fully managed file storage
Cloud Firestore | A flexible, scalable database for mobile, web, and server development
Cloud Interconnect | Hybrid connectivity
Cloud IoT | Secure device connection and management
Cloud Load Balancing | Distribute load-balanced compute resources
Cloud Memorystore for Redis | A fully managed in-memory data store service
Cloud Router | Exchange routes between your VPC and on-premises networks by using BGP
Cloud Run | Managed compute platform that runs stateless containers through HTTP
Cloud Tasks | Distributed task queues
Cloud TPU | Train and run machine learning models
Compute Engine | High-performance virtual machines
Container Engine | Kubernetes, managed by Google
Datastore | NoSQL database
Firebase | Mobile platform for application development
Functions | Serverless platform for building event-based microservices
Machine Learning | Machine learning services
Pub/Sub | Real-time messaging service
Spanner | Horizontally scalable, globally consistent, relational database service
Stackdriver Logging | Real-time log management and analysis
Storage | Unified object storage
VPN | Managed network functionality

Setup

Metric Collection

Installation

The Datadog <> Google Cloud integration uses Service Accounts to create an API connection between Google Cloud and Datadog. Below are instructions for creating a service account and providing Datadog with service account credentials to begin making API calls on your behalf.

  1. Navigate to the Google Cloud credentials page for the Google Cloud project where you would like to set up the Datadog integration.
  2. Press Create credentials and then select Service account key.

  3. In the Service account dropdown, select New service account.

  4. Give the service account a unique name.

  5. For Role, select Compute Engine > Compute Viewer, Monitoring > Monitoring Viewer, and Cloud Asset > Cloud Asset Viewer.

    Note: These roles allow Datadog to collect metrics, tags, events, and user labels on your behalf.

  6. Select JSON as the key type, and press Create. Take note of where this file is saved, as it is needed to complete the installation.

  7. Navigate to the Datadog Google Cloud Integration tile.

  8. On the Configuration tab, select Upload Key File to integrate this project with Datadog.

  9. Optionally, you can use tags to filter out hosts from being included in this integration. Detailed instructions can be found in the Configuration section below.

  10. Press Install/Update.

  11. If you want to monitor multiple projects, use one of the following methods:

    • Repeat the process above to use multiple service accounts.
    • Use the same service account by updating the project_id in the JSON file downloaded in step 6. Then upload the file to Datadog as described in steps 7-10.

Google Cloud billing, the Stackdriver Monitoring API, and the Compute Engine API must all be enabled for the project(s) you wish to monitor.
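
If you prefer the command line, the console steps above can be approximated with the gcloud CLI. This is a minimal sketch, assuming the Cloud SDK is installed and authenticated; the account name datadog-integration, the key file name, and <PROJECT_ID> are placeholders:

    # Create a dedicated service account for the Datadog integration.
    gcloud iam service-accounts create datadog-integration --display-name "Datadog integration"

    # Grant the three viewer roles listed in step 5.
    for role in roles/compute.viewer roles/monitoring.viewer roles/cloudasset.viewer; do
      gcloud projects add-iam-policy-binding <PROJECT_ID> \
        --member "serviceAccount:datadog-integration@<PROJECT_ID>.iam.gserviceaccount.com" \
        --role "$role"
    done

    # Create the JSON key to upload in the Datadog integration tile (step 8).
    gcloud iam service-accounts keys create datadog-key.json \
      --iam-account "datadog-integration@<PROJECT_ID>.iam.gserviceaccount.com"

    # Enable the APIs the integration needs for the monitored project.
    gcloud services enable monitoring.googleapis.com compute.googleapis.com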

Configuration

Optionally, you can limit the GCE instances that are pulled into Datadog by entering tags in the Limit Metric Collection textbox. Only hosts that match one of the defined tags are imported into Datadog. You can use wildcards (? for single character, * for multi-character) to match many hosts, or ! to exclude certain hosts. This example includes all c1* sized instances, but excludes staging hosts:

datadog:monitored,env:production,!env:staging,instance-type:c1.*

Log Collection

For applications running in GCE or GKE, the Datadog Agent can be used to collect logs locally. GCP service logs are collected via Stackdriver and sent to a Cloud Pub/Sub with an HTTP push forwarder. Setting up log collection requires five steps:

  1. If you haven’t already, set up the Google Cloud platform integration first.
  2. Create a new Cloud Pub/Sub.
  3. Validate your Datadog domain so that logs can be pushed from GCP to Datadog.
  4. Setup the Pub/Sub to forward logs to Datadog.
  5. Configure exports from Stackdriver logs to the Pub/Sub.

Warning: Pub/Subs are subject to Google Cloud quotas and limitations. If your log volume is higher than those limits, split your logs over several Pub/Subs. See the Monitor the Log Forwarding section for information on how to set up a monitor that notifies you automatically when you approach those limits.

Create a Cloud Pub/Sub

  1. Go to the Cloud Pub/Sub console and create a new topic.

  2. Give that topic an explicit name such as export-logs-to-datadog and Save.
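
If you are working from a terminal instead, the same topic can be created with the gcloud CLI (a minimal sketch; adjust the topic name to your own convention):

    gcloud pubsub topics create export-logs-to-datadog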

Validate the Datadog Domain

To validate the domain, ask Google to generate an HTML file that is used as a unique identifier. This allows Google to validate the Datadog endpoint and forward logs to it.

  1. Connect to the Google Search Console.
  2. In the URL section, add: https://gcp-intake.logs.datadoghq.com/v1/input/<DATADOG_API_KEY> (find your Datadog API key on your API settings page).
  3. Download the HTML file locally.

  4. Push this HTML file to Datadog with the following command:

    curl -X POST -H "Content-type: application/json" -d '{"file_contents": "google-site-verification: <GOOGLE_FILE_NAME>.html"}' "https://app.datadoghq.com/api/latest/integration/gcp_logs_site_verification?api_key=<DATADOG_API_KEY>&application_key=<DATADOG_APPLICATION_KEY>"
    

    <DATADOG_API_KEY> and <DATADOG_APPLICATION_KEY> can be found in your Datadog API settings. The expected result of this command is {}.

  5. Click Verify on the Google console and wait until the confirmation message shows that it worked.

  6. Go to the API credentials page in the GCP console and click Add domain.

  7. Enter the same endpoint as earlier and click Add.

Once this is done, click the Search Console link in the pop-up to confirm that it was properly enabled.

The GCP project is now ready to forward logs from the Pub/Sub to Datadog.

Note: For the Datadog EU site, follow the same steps with https://gcp-intake.logs.datadoghq.eu/v1/input/<DATADOG_API_KEY> as the URL, and push the verification file to https://app.datadoghq.eu/api/latest/integration/gcp_logs_site_verification instead.

Configure the Pub/Sub to forward logs to Datadog

  1. Go back to the Pub/Sub that was previously created, and add a new subscription.

  2. Select the Push method and enter the following: https://gcp-intake.logs.datadoghq.com/v1/input/<DATADOG_API_KEY>/

  3. Hit Create at the bottom.

The Pub/Sub is now ready to receive logs from Stackdriver and forward them to Datadog.

Note: If you see an error here at step 3, it means that the Datadog site was not validated. Refer to the domain validation steps and make sure the domain is validated.

Note: For the Datadog EU site, use https://gcp-intake.logs.datadoghq.eu/v1/input/<DATADOG_API_KEY>/ as the push endpoint instead.
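
For reference, the same push subscription can be sketched with the gcloud CLI. The subscription name below is a placeholder, and the endpoint must already be validated as described above:

    gcloud pubsub subscriptions create export-logs-to-datadog-sub \
      --topic export-logs-to-datadog \
      --push-endpoint "https://gcp-intake.logs.datadoghq.com/v1/input/<DATADOG_API_KEY>/"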

Export logs from Stackdriver to the Pub/Sub

  1. Go to the Stackdriver page and filter the logs that need to be exported.
  2. Hit Create Export and name the sink accordingly.
  3. Choose Cloud Pub/Sub as the destination and select the Pub/Sub that was created for that purpose. Note that the Pub/Sub can be located in a different project.

  4. Hit Create and wait for the confirmation message to show up.

Note: It is possible to create several exports from Stackdriver to the same Pub/Sub with different sinks.
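
The export can also be sketched with the gcloud CLI. The sink name and log filter below are illustrative placeholders; gcloud prints the sink's writer identity, which needs permission to publish to the topic:

    # Create a sink that exports matching logs to the Pub/Sub topic.
    gcloud logging sinks create datadog-sink \
      pubsub.googleapis.com/projects/<PROJECT_ID>/topics/export-logs-to-datadog \
      --log-filter 'resource.type="gce_instance"'

    # Grant the writer identity printed by the previous command the publisher role.
    gcloud pubsub topics add-iam-policy-binding export-logs-to-datadog \
      --member "serviceAccount:<WRITER_IDENTITY>" \
      --role roles/pubsub.publisher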


Monitor the Log Forwarding

Pub/Subs are subject to Google Cloud quotas and limitations. If your log volume is higher than those limits, split your logs over several Pub/Subs.

To be automatically notified when you approach this quota, activate the Pub/Sub metric integration and set up a monitor on the metric gcp.pubsub.subscription.backlog_bytes, filtered on the subscription that exports logs to Datadog, to make sure it never goes above 1 MB.
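
As a sketch, such a monitor can be created through the Datadog monitor API. The query below assumes the subscription is tagged with subscription_id:export-logs-to-datadog-sub; the monitor name, the threshold (1 MB expressed as 1000000 bytes), and the notification message are illustrative:

    curl -X POST "https://app.datadoghq.com/api/v1/monitor?api_key=<DATADOG_API_KEY>&application_key=<DATADOG_APPLICATION_KEY>" \
      -H "Content-type: application/json" \
      -d '{
            "type": "metric alert",
            "name": "Log export Pub/Sub backlog is growing",
            "query": "avg(last_5m):avg:gcp.pubsub.subscription.backlog_bytes{subscription_id:export-logs-to-datadog-sub} > 1000000",
            "message": "The log export subscription is close to the Pub/Sub push limits. Consider splitting logs over several Pub/Subs."
          }'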



Data Collected

Metrics

See the individual Google Cloud integration pages for metrics.

Events

All service events generated by your Google Cloud Platform projects are forwarded to your Datadog event stream. Other events captured in Stackdriver are not currently available, but will be in the future with the Datadog Log Management product.

Service Checks

The Google Cloud Platform integration does not include any service checks.

Troubleshooting

Incorrect metadata for user-defined gcp.logging metrics?

For non-standard gcp.logging metrics (that is, metrics beyond Datadog's out-of-the-box logging metrics), the metadata applied may not be consistent with Stackdriver.

In these cases, set the metadata manually: navigate to the metric summary page, search for and select the metric in question, and click the pencil icon next to Metadata.
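
Alternatively, the metadata can be edited through the Datadog metrics API, as a sketch; the metric name, type, and unit below are placeholders to replace with the correct values for your metric:

    curl -X PUT "https://app.datadoghq.com/api/v1/metrics/<METRIC_NAME>?api_key=<DATADOG_API_KEY>&application_key=<DATADOG_APPLICATION_KEY>" \
      -H "Content-type: application/json" \
      -d '{"type": "gauge", "unit": "byte", "description": "User-defined gcp.logging metric"}'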

Need help? Contact Datadog support.

Further Reading

Knowledge Base

Tags Assigned

Tags are automatically assigned based on a variety of Google Cloud Platform and Google Compute Engine configuration options. The following tags are assigned automatically:

  • zone
  • instance-type
  • instance-id
  • automatic-restart
  • on-host-maintenance
  • project
  • numeric_project_id
  • name

Also, any hosts with <key>:<value> labels are tagged accordingly.