Connect to Google Cloud Platform to see all your Google Compute Engine (GCE) hosts in Datadog. Your hosts appear in the infrastructure overview in Datadog, where you can sort through them, since Datadog automatically tags them with GCE host tags and any GCE labels you may have added.
| Integration | Description |
|---|---|
| App Engine | PaaS (platform as a service) to build scalable applications |
| Big Query | Enterprise data warehouse |
| Bigtable | NoSQL Big Data database service |
| Cloud SQL | MySQL database service |
| Cloud APIs | Programmatic interfaces for all Google Cloud Platform services |
| Cloud Composer | A fully managed workflow orchestration service |
| Cloud Dataproc | A cloud service for running Apache Spark and Apache Hadoop clusters |
| Cloud Filestore | High-performance, fully managed file storage |
| Cloud Firestore | A flexible, scalable database for mobile, web, and server development |
| Cloud Interconnect | Hybrid connectivity |
| Cloud IoT | Secure device connection and management |
| Cloud Load Balancing | Distribute load-balanced compute resources |
| Cloud Memorystore for Redis | A fully managed in-memory data store service |
| Cloud Router | Exchange routes between your VPC and on-premises networks by using BGP |
| Cloud Run | Managed compute platform that runs stateless containers through HTTP |
| Cloud Tasks | Distributed task queues |
| Cloud TPU | Train and run machine learning models |
| Compute Engine | High-performance virtual machines |
| Container Engine | Kubernetes, managed by Google |
| Datastore | NoSQL database |
| Firebase | Mobile platform for application development |
| Functions | Serverless platform for building event-based microservices |
| Kubernetes Engine | Cluster manager and orchestration system |
| Machine Learning | Machine learning services |
| Pub/Sub | Real-time messaging service |
| Spanner | Horizontally scalable, globally consistent, relational database service |
| Cloud Logging | Real-time log management and analysis |
| Storage | Unified object storage |
| VPN | Managed network functionality |
The Datadog <> Google Cloud integration uses Service Accounts to create an API connection between Google Cloud and Datadog. Below are instructions for creating a service account and providing Datadog with service account credentials to begin making API calls on your behalf.
Note: Google Cloud billing, the Cloud Monitoring API, the Compute Engine API, and the Cloud Asset API must all be enabled for the project(s) you wish to monitor.
1. Navigate to the Google Cloud credentials page for the Google Cloud project where you would like to set up the Datadog integration.
2. Click Create credentials (near the top) and select Service account.
3. Give the service account a unique name and click Create.
4. Add the following roles: Compute Viewer, Monitoring Viewer, and Cloud Asset Viewer. Click Done.
   Note: You must be a Service Account Key Admin to select Compute Engine and Cloud Asset roles. All selected roles allow Datadog to collect metrics, tags, events, and user labels on your behalf.
5. At the bottom of the page, find your Service Accounts and select the one you just created. Click Add Key -> Create new key, and choose JSON as the type. Click Create and Save. Take note of where this file is saved, as it is needed to complete the installation. (An optional way to verify the key is sketched after these steps.)
6. Navigate to the Datadog Google Cloud Integration tile.
7. On the Configuration tab, select Upload Key File to integrate this project with Datadog.
8. Optionally, you can use tags to filter out hosts from being included in this integration. Detailed instructions on this can be found below.
9. Press Install/Update.
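Before uploading the key file from step 5, you can optionally verify that it works. The following is a minimal sketch, not part of the official setup, assuming the `google-auth` and `google-api-python-client` packages and a hypothetical key path; a successful call confirms that the Cloud Monitoring API is enabled and the Monitoring Viewer role is in place for the project:

```python
# Optional sanity check: confirm the downloaded key file is valid and that the
# service account can reach the Cloud Monitoring API before uploading it to Datadog.
# Requires: pip install google-auth google-api-python-client
from google.oauth2 import service_account
from googleapiclient.discovery import build

KEY_PATH = "/path/to/your-key.json"  # hypothetical path to the key from step 5

creds = service_account.Credentials.from_service_account_file(
    KEY_PATH,
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
monitoring = build("monitoring", "v3", credentials=creds)

# List a few metric descriptors; a successful response means the API
# connection and permissions are working for this project.
resp = (
    monitoring.projects()
    .metricDescriptors()
    .list(name=f"projects/{creds.project_id}", pageSize=5)
    .execute()
)
for descriptor in resp.get("metricDescriptors", []):
    print(descriptor["type"])
```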
If you want to monitor multiple projects, update the `project_id` in the JSON file downloaded in step 5 for each additional project, then upload each file to Datadog as described in steps 6 through 9.

Optionally, you can limit the GCE instances that are pulled into Datadog by entering tags in the Limit Metric Collection textbox under a given project's dropdown menu. Only hosts that match one of the defined tags are imported into Datadog. You can use wildcards (`?` for a single character, `*` for multiple characters) to match many hosts, or `!` to exclude certain hosts. This example includes all `c1*`-sized instances but excludes staging hosts:

`datadog:monitored,env:production,!env:staging,instance-type:c1.*`
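To make the matching semantics concrete, here is an illustrative sketch. This is not Datadog's implementation; it simply assumes fnmatch-style wildcard matching under the rules described above:

```python
# Illustrative sketch of the Limit Metric Collection semantics described above.
# Not Datadog's implementation; assumes fnmatch-style wildcards
# (? = one character, * = many characters) and ! as an exclusion prefix.
from fnmatch import fnmatch

FILTER = "datadog:monitored,env:production,!env:staging,instance-type:c1.*"

def host_included(host_tags, filter_string=FILTER):
    rules = [r.strip() for r in filter_string.split(",")]
    includes = [r for r in rules if not r.startswith("!")]
    excludes = [r[1:] for r in rules if r.startswith("!")]
    # A host is rejected if any of its tags matches an exclusion rule...
    if any(fnmatch(tag, pattern) for tag in host_tags for pattern in excludes):
        return False
    # ...and accepted if any of its tags matches one of the inclusion rules.
    return any(fnmatch(tag, pattern) for tag in host_tags for pattern in includes)

print(host_included(["env:production", "instance-type:c1.small"]))  # True
print(host_included(["env:staging", "instance-type:c1.small"]))     # False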
See Google’s documentation on Creating and managing labels for more details.
For applications running in GCE or GKE, the Datadog Agent can be used to collect logs locally. GCP service logs are collected with Google Cloud Logging and sent to a Cloud Pub/Sub topic with an HTTP push forwarder. Setting up log collection involves creating a Pub/Sub topic and push subscription, exporting logs to that topic from Google Cloud Logging, and monitoring the forwarding, as described below.
Warning: Pub/subs are subject to Google Cloud quotas and limitations. If the number of logs you have is higher than those limitations, Datadog recommends you split your logs over several topics. See the Monitor the Log Forwarding section for information on how to set up a monitor to be automatically notified if you get close to those limits.
1. Go to the Cloud Pub/Sub console and create a new topic.
2. Give that topic an explicit name such as `export-logs-to-datadog` and Save.
3. Go back to the Pub/Sub Topics overview page and select Subscriptions in the left hand navigation. Select Create Subscription.
4. Create a subscription ID and select the topic you previously created.
5. Select the Push method and enter the following, where `<DATADOG_SITE>` is your Datadog site (for example, `datadoghq.com`): `https://gcp-intake.logs.<DATADOG_SITE>/api/v2/logs?dd-api-key=<DATADOG_API_KEY>&dd-protocol=gcp`
   Note: You can create an API key or pick an existing API key in Datadog Organization Settings -> API Keys.
6. Configure any additional options, such as Subscription expiration, Acknowledgment deadline, Message retention duration, or Dead lettering.
7. Hit Create at the bottom.
The Pub/Sub is ready to receive logs from Google Cloud Logging and forward them to Datadog.
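If you prefer to script the topic and subscription creation, the same steps can be performed with the `google-cloud-pubsub` client library. A minimal sketch, with a hypothetical project ID and subscription name:

```python
# Sketch of the Pub/Sub steps above using the google-cloud-pubsub client library
# (pip install google-cloud-pubsub). Project ID, subscription name, and the
# API key placeholder are assumptions; substitute your own values.
from google.cloud import pubsub_v1

PROJECT_ID = "my-gcp-project"          # hypothetical project
DATADOG_SITE = "datadoghq.com"         # your Datadog site
DATADOG_API_KEY = "<DATADOG_API_KEY>"  # from Organization Settings -> API Keys

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()

topic_path = publisher.topic_path(PROJECT_ID, "export-logs-to-datadog")
publisher.create_topic(request={"name": topic_path})

# Push subscription pointing at the Datadog GCP log intake endpoint.
endpoint = (
    f"https://gcp-intake.logs.{DATADOG_SITE}/api/v2/logs"
    f"?dd-api-key={DATADOG_API_KEY}&dd-protocol=gcp"
)
subscription_path = subscriber.subscription_path(PROJECT_ID, "datadog-log-forwarder")
subscriber.create_subscription(
    request={
        "name": subscription_path,
        "topic": topic_path,
        "push_config": pubsub_v1.types.PushConfig(push_endpoint=endpoint),
    }
)
```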
1. Go to the Logs Explorer page and filter the logs that need to be exported.
2. From the Actions menu, select Create Sink.
3. Provide a name for the sink.
4. Choose Cloud Pub/Sub as the destination and select the pub/sub that was created for that purpose. Note: The pub/sub can be located in a different project.
5. Click Create Sink and wait for the confirmation message to show up.
Note: It is possible to create several exports from Google Cloud Logging to the same Pub/Sub with different sinks.
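The sink can also be created programmatically with the `google-cloud-logging` client library. A minimal sketch, with hypothetical project, sink, and filter values:

```python
# Sketch of the sink-creation steps above using google-cloud-logging
# (pip install google-cloud-logging). Names and the filter are assumptions.
from google.cloud import logging

client = logging.Client(project="my-gcp-project")  # hypothetical project

# The destination references the Pub/Sub topic created earlier; it may live
# in a different project, as noted above.
destination = "pubsub.googleapis.com/projects/my-gcp-project/topics/export-logs-to-datadog"

sink = client.sink(
    "export-logs-to-datadog-sink",           # hypothetical sink name
    filter_='resource.type="gce_instance"',  # example filter; adjust as needed
    destination=destination,
)
sink.create()
print(f"Created sink {sink.name}")

# When creating sinks through the API, remember to grant the sink's writer
# identity permission to publish to the topic (the console handles this for you).
```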
Pub/subs are subject to Google Cloud quotas and limitations. If the number of logs you have is higher than those limitations, Datadog recommends you split your logs over several topics, using different filters.
To be automatically notified when you reach this quota, activate the Pub/Sub metric integration and set up a monitor on the metric `gcp.pubsub.subscription.num_outstanding_messages`. Filter this monitor on the subscription that exports logs to Datadog to make sure it never goes above 1000.
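One way to create such a monitor programmatically is with the `datadog` Python library. In this sketch, the subscription tag value, threshold window, and notification handle are assumptions to adapt to your setup:

```python
# Sketch: create the quota monitor with the datadog library (pip install datadog).
# The subscription_id tag value and the notification handle are assumptions.
from datadog import initialize, api

initialize(api_key="<DATADOG_API_KEY>", app_key="<DATADOG_APP_KEY>")

api.Monitor.create(
    type="metric alert",
    query=(
        "max(last_5m):max:gcp.pubsub.subscription.num_outstanding_messages"
        "{subscription_id:datadog-log-forwarder} > 1000"
    ),
    name="Datadog log forwarding Pub/Sub is backing up",
    message=(
        "Outstanding messages are near the push quota. "
        "Consider splitting logs over several topics. @your-team"
    ),
    tags=["integration:gcp"],
)
```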
See the individual Google Cloud integration pages for metrics.
All service events generated by your Google Cloud Platform projects are forwarded to your Datadog event stream.
The Google Cloud Platform integration does not include any service checks.
Tags are automatically assigned based on a variety of Google Cloud Platform and Google Compute Engine configuration options.
Additionally, Datadog collects any `<key>:<value>` labels as tags.

For non-standard `gcp.logging` metrics (such as metrics beyond Datadog's out-of-the-box logging metrics), the metadata applied may not be consistent with Google Cloud Logging.
In these cases, the metadata should be manually set by navigating to the metric summary page, searching for and selecting the metric in question, and clicking the pencil icon next to the metadata.
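If editing through the UI is impractical, for example when many metrics are affected, the same metadata can be set through the Datadog API. A sketch using the `datadog` library, with a hypothetical metric name:

```python
# Sketch: set metric metadata through the Datadog API instead of the pencil icon.
# Requires: pip install datadog. The metric name and values below are examples only.
from datadog import initialize, api

initialize(api_key="<DATADOG_API_KEY>", app_key="<DATADOG_APP_KEY>")

api.Metadata.update(
    metric_name="gcp.logging.user.my_custom_metric",  # hypothetical metric
    type="gauge",                                     # metric type to apply
    description="Custom log-based metric forwarded from Google Cloud Logging",
)
```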
Need help? Contact Datadog support.