
Google Cloud Platform


Overview

Connect to Google Cloud Platform to see all your Google Compute Engine (GCE) hosts in Datadog. You can see your hosts in the infrastructure overview in Datadog and sort through them, since Datadog automatically tags them with GCE host tags and any GCE labels you may have added.

Related integrations include:

  • App Engine: PaaS (platform as a service) to build scalable applications
  • Big Query: Enterprise data warehouse
  • Bigtable: NoSQL Big Data database service
  • CloudSQL: MySQL database service
  • Cloud APIs: Programmatic interfaces for all Google Cloud Platform services
  • Cloud Composer: A fully managed workflow orchestration service
  • Cloud Dataproc: A cloud service for running Apache Spark and Apache Hadoop clusters
  • Cloud Filestore: High-performance, fully managed file storage
  • Cloud Firestore: A flexible, scalable database for mobile, web, and server development
  • Cloud Interconnect: Hybrid connectivity
  • Cloud IoT: Secure device connection and management
  • Cloud Load Balancing: Distribute load-balanced compute resources
  • Cloud Memorystore for Redis: A fully managed in-memory data store service
  • Cloud Router: Exchange routes between your VPC and on-premises networks by using BGP
  • Cloud Run: Managed compute platform that runs stateless containers through HTTP
  • Cloud Tasks: Distributed task queues
  • Cloud TPU: Train and run machine learning models
  • Compute Engine: High-performance virtual machines
  • Container Engine: Kubernetes, managed by Google
  • Datastore: NoSQL database
  • Firebase: Mobile platform for application development
  • Functions: Serverless platform for building event-based microservices
  • Machine Learning: Machine learning services
  • Pub/Sub: Real-time messaging service
  • Spanner: Horizontally scalable, globally consistent, relational database service
  • Stackdriver Logging: Real-time log management and analysis
  • Storage: Unified object storage
  • VPN: Managed network functionality

Setup

Metric Collection

Installation

The Datadog <> Google Cloud integration uses Service Accounts to create an API connection between Google Cloud and Datadog. Below are instructions for creating a service account and providing Datadog with service account credentials to begin making API calls on your behalf.

  1. Navigate to the Google Cloud credentials page for the Google Cloud project where you would like to set up the Datadog integration.
  2. Press Create credentials and then select Service account key.

  3. In the Service account dropdown, select New service account.

  4. Give the service account a unique name.

  5. For Role, select Compute Engine > Compute Viewer and Monitoring > Monitoring Viewer.

    Note: these roles allow Datadog to collect metrics, tags, events, and GCE labels on your behalf.

  6. Select JSON as the key type, and press Create. Take note of where this file is saved, as it is needed to complete the installation.

  7. Navigate to the Datadog Google Cloud Integration tile.

  8. Select Upload Key File to integrate this project with Datadog.

  9. Optionally, use tags to filter which hosts are included in this integration. Detailed instructions are in the Configuration section below.

  10. Press Install/Update.

  11. If you want to monitor multiple projects, use one of the following methods:

    • Repeat the process above to use multiple service accounts.
    • Use the same service account by updating the project_id in the JSON file downloaded in step 6. Then upload the file to Datadog as described in steps 7-10.

Google Cloud billing, the Stackdriver Monitoring API, and the Compute Engine API must all be enabled for the project(s) you wish to monitor.
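
If you prefer the command line over the console flow above, the same service account, roles, key, and API enablement can be done with the gcloud CLI. A minimal sketch, assuming gcloud is installed and authenticated; the account name datadog-integration and the project ID my-project are placeholders:

    # Create the service account (steps 3-4 above).
    gcloud iam service-accounts create datadog-integration \
        --display-name "Datadog integration" --project my-project

    # Grant the Compute Viewer and Monitoring Viewer roles (step 5).
    for role in roles/compute.viewer roles/monitoring.viewer; do
        gcloud projects add-iam-policy-binding my-project \
            --member "serviceAccount:datadog-integration@my-project.iam.gserviceaccount.com" \
            --role "$role"
    done

    # Create the JSON key to upload in the Datadog tile (steps 6-8).
    gcloud iam service-accounts keys create datadog-key.json \
        --iam-account "datadog-integration@my-project.iam.gserviceaccount.com"

    # Enable the APIs required for monitoring.
    gcloud services enable compute.googleapis.com monitoring.googleapis.com \
        --project my-project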

Configuration

Optionally, limit the GCE instances pulled into Datadog by entering tags in the Limit Metric Collection text box. Only hosts that match one of the defined tags are imported into Datadog. Use wildcards (? for a single character, * for multiple characters) to match many hosts, or ! to exclude certain hosts. This example includes all c1.* instance types but excludes staging hosts:

datadog:monitored,env:production,!env:staging,instance-type:c1.*

Log Collection

For applications running in GCE or GKE, the Datadog Agent can be used to collect logs locally. GCP service logs are collected via Stackdriver and sent to a Cloud Pub/Sub with an HTTP push forwarder. Log collection requires four steps:

  1. Create a new Cloud Pub/Sub.
  2. Validate your Datadog domain so that logs can be pushed from GCP to Datadog.
  3. Set up the Pub/Sub to forward logs to Datadog.
  4. Configure exports from Stackdriver logs to the Pub/Sub.

Create a Cloud Pub/Sub

  1. Go to the Cloud Pub/Sub console and create a new topic.

  2. Give that topic an explicit name such as export-logs-to-datadog and Save.
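
If you prefer to script this step, the same topic can be created with the gcloud CLI. A minimal sketch, assuming gcloud is installed and authenticated against the right project; the topic name matches the example above:

    # Create the Pub/Sub topic that receives exported logs.
    gcloud pubsub topics create export-logs-to-datadog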

Validate the Datadog Domain

To validate the domain, ask Google to generate an HTML file that is used as a unique identifier. This allows Google to validate the Datadog endpoint and forward logs to it.

  1. Connect to the Google Search Console.
  2. In the URL section add: https://gcp-intake.logs.datadoghq.com/v1/input/<DATADOG_API_KEY> (your Datadog API key is available on the Datadog API settings page).
  3. Download the HTML file locally.

  4. Push this HTML file to Datadog with the following command:

    curl -X POST -H "Content-type: application/json" -d '{"file_contents": "google-site-verification: <GOOGLE_FILE_NAME>.html"}' "https://app.datadoghq.com/api/latest/integration/gcp_logs_site_verification?api_key=<DATADOG_API_KEY>&application_key=<DATADOG_APPLICATION_KEY>"
    

    <DATADOG_API_KEY> and <DATADOG_APPLICATION_KEY> can be found in the Datadog API settings page. The expected result of this command is {}.

  5. Click Verify on the Google console and wait until the confirmation message shows that it worked.

  6. Go to the API credentials page in the GCP console and click Add domain.

  7. Enter the same endpoint as earlier and click Add.

Once this is done, click the Search Console link in the pop-up to confirm that the domain was properly enabled.

If you use the Datadog EU site, follow the same validation steps with the EU endpoints instead: add https://gcp-intake.logs.datadoghq.eu/v1/input/<DATADOG_API_KEY> in the Google Search Console, and push the HTML file to https://app.datadoghq.eu/api/latest/integration/gcp_logs_site_verification with the same curl command.

The GCP project is now ready to forward logs from the Pub/Sub to Datadog.

Configure the Pub/Sub to forward logs to Datadog

  1. Go back to the Pub/Sub that was previously created, and add a new subscription:

  2. Select the Push method and enter the following: https://gcp-intake.logs.datadoghq.com/v1/input/<DATADOG_API_KEY>/

  3. Hit Create at the bottom.

The Pub/Sub is now ready to receive logs from Stackdriver and forward them to Datadog.

Note: If you see an error here at step 3, it means that the Datadog site was not validated. Refer to the domain validation steps and make sure the domain is validated.

If you use the Datadog EU site, set the push endpoint to https://gcp-intake.logs.datadoghq.eu/v1/input/<DATADOG_API_KEY>/ instead.
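
As a command-line alternative to the console steps above, a push subscription can be created with gcloud. A minimal sketch, assuming the topic created earlier; the subscription name is a placeholder, and EU-site users should substitute the datadoghq.eu endpoint:

    # Create a push subscription that forwards the topic's messages to Datadog.
    gcloud pubsub subscriptions create datadog-log-forwarder \
        --topic export-logs-to-datadog \
        --push-endpoint "https://gcp-intake.logs.datadoghq.com/v1/input/<DATADOG_API_KEY>/"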

Export logs from Stackdriver to the Pub/Sub

  1. Go to the Stackdriver page and filter the logs that need to be exported.
  2. Hit Create Export and name the sink accordingly.
  3. Choose Cloud Pub/Sub as the destination and select the Pub/Sub that was created for that purpose. Note that the Pub/Sub can be located in a different project.

  4. Hit Create and wait for the confirmation message to show up.

Note: It is possible to create several exports from Stackdriver to the same Pub/Sub with different sinks.
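
This export can also be scripted. A minimal sketch with gcloud, assuming the topic created earlier; the sink name, project ID, and log filter are placeholders to adjust:

    # Create a sink that exports matching log entries to the Pub/Sub topic.
    gcloud logging sinks create datadog-export \
        pubsub.googleapis.com/projects/my-project/topics/export-logs-to-datadog \
        --log-filter 'resource.type="gce_instance"'

    # The command above prints the sink's writer identity; grant it permission
    # to publish to the topic.
    gcloud pubsub topics add-iam-policy-binding export-logs-to-datadog \
        --member "serviceAccount:<SINK_WRITER_IDENTITY>" \
        --role "roles/pubsub.publisher"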

Data Collected

Metrics

See the individual Google Cloud integration pages for metrics.

Events

All service events generated by your Google Cloud Platform services are forwarded to your Datadog event stream. Other events captured in Stackdriver are not currently available, but will be in the future with the Datadog Log Management product.

Service Checks

The Google Cloud Platform integration does not include any service checks.

Troubleshooting

Incorrect metadata for user-defined gcp.logging metrics?

For non-standard gcp.logging metrics (that is, metrics beyond Datadog's out-of-the-box logging metrics), the metadata applied may not be consistent with Stackdriver.

In these cases, set the metadata manually: navigate to the metric summary page, search for and select the metric in question, and click the pencil icon next to the metadata.
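
If you prefer to script this, the same fields can be set with Datadog's edit-metric-metadata endpoint. A minimal sketch, where the metric name, type, and unit are placeholder values to replace with your own:

    curl -X PUT -H "Content-type: application/json" \
        -d '{"type": "gauge", "unit": "byte"}' \
        "https://app.datadoghq.com/api/v1/metrics/<METRIC_NAME>?api_key=<DATADOG_API_KEY>&application_key=<DATADOG_APPLICATION_KEY>"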

Need help? Contact Datadog support.

Further Reading

Knowledge Base

Tags Assigned

Tags are automatically assigned based on a variety of Google Cloud Platform and Google Compute Engine configuration options. The following tags are automatically assigned:

  • zone
  • instance-type
  • instance-id
  • automatic-restart
  • on-host-maintenance
  • project
  • numeric_project_id
  • name

Also, any hosts with <key>:<value> labels are tagged accordingly.