Connect to Google Cloud Platform to see all your Google Compute Engine (GCE) hosts in Datadog. You can see your hosts in the infrastructure overview in Datadog and sort through them, since Datadog automatically tags them with GCE host tags and any GCE labels you may have added.
Datadog's GCP integration is built to collect all Google Cloud metrics. Datadog strives to continually update the docs to show every sub-integration, but cloud services rapidly release new metrics and services, so the list of sub-integrations may sometimes lag behind.
This method enables you to monitor all projects visible to a service account by assigning IAM roles in the relevant projects. You can assign these roles to projects individually, or you can configure Datadog to monitor groups of projects by assigning these roles at the organization or folder level. Assigning roles in this way allows Datadog to automatically discover and monitor all projects in the given scope, including any new projects that may be added to the group in the future.
Prerequisites
If your organization restricts identities by domain, you must add Datadog’s customer identity as an allowed value in your policy. Datadog’s customer identity: C0147pk0i
Service account impersonation and automatic project discovery rely on certain roles and APIs being enabled in the projects you want to monitor. Before you start, ensure the required APIs are enabled for each project you want to monitor.
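If you manage projects with the gcloud CLI, a minimal sketch of enabling the core metric-collection APIs looks like the following. The API list here is an assumption based on the roles granted later in this guide; adjust it for the services you monitor.

```shell
# Assumed core APIs for Datadog metric and resource collection
# (Compute Engine, Cloud Monitoring, Cloud Asset, Resource Manager).
gcloud services enable \
  compute.googleapis.com \
  monitoring.googleapis.com \
  cloudasset.googleapis.com \
  cloudresourcemanager.googleapis.com \
  --project=<PROJECT_ID>
```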
Go to the Permissions tab and click on Grant Access.
Paste your Datadog principal into the New principals text box.
Assign the role of Service Account Token Creator and click Save.
Note: If you previously configured access using a shared Datadog principal, you can revoke the permission for that principal after you complete these steps.
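If you prefer to script this step, the console actions above map to a single IAM binding on the service account. In this sketch, <SA_EMAIL> is the service account you created and <DATADOG_PRINCIPAL> is the principal you copied from the integration tile.

```shell
# Allow the Datadog principal to generate tokens for (impersonate) the
# service account. Placeholders: <SA_EMAIL>, <DATADOG_PRINCIPAL>.
gcloud iam service-accounts add-iam-policy-binding <SA_EMAIL> \
  --member="serviceAccount:<DATADOG_PRINCIPAL>" \
  --role="roles/iam.serviceAccountTokenCreator"
```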
3. Complete the integration setup in Datadog
In your Google Cloud console, navigate to the Service Account > Details tab. There, you can find the email associated with this Google service account. It resembles <sa-name>@<project-id>.iam.gserviceaccount.com.
Copy this email.
Return to the integration configuration tile in Datadog (where you copied your Datadog principal in the previous section).
In the box under Add Service Account Email, paste the email you previously copied.
Click on Verify and Save Account.
In approximately fifteen minutes, metrics appear in Datadog.
4. Assign roles to other projects (optional)
Automatic project discovery simplifies the process of adding additional projects to be monitored. If you grant your service account access to other projects, folders, or orgs, Datadog discovers these projects (and any projects nested in the folders or orgs) and automatically adds them to your integration tile.
Make sure you have the appropriate permissions to assign roles at the desired scope:
Project IAM Admin (or higher)
Folder Admin
Organization Admin
In the Google Cloud console, go to the IAM page.
Select a project, folder, or organization.
To grant a role to a principal that does not already have other roles on the resource, click Grant Access, then enter the email of the service account you created earlier.
Assign the following roles:
Compute Viewer
Monitoring Viewer
Cloud Asset Viewer
Note: The Browser role is only required in the default project of the service account.
Click Save.
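The same grants can be scripted. This sketch assigns the three roles at the project scope; gcloud resource-manager folders add-iam-policy-binding and gcloud organizations add-iam-policy-binding accept the same flags for folder and organization scopes.

```shell
# Grant the monitoring roles to the Datadog service account on a project.
SA_EMAIL="<sa-name>@<project-id>.iam.gserviceaccount.com"
for role in roles/compute.viewer roles/monitoring.viewer roles/cloudasset.viewer; do
  gcloud projects add-iam-policy-binding <TARGET_PROJECT_ID> \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="${role}"
done
```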
Configuration
Optionally, you can limit the GCE instances that are pulled into Datadog by entering tags in the Limit Metric Collection textbox under a given project’s dropdown menu. Only hosts that match one of the defined tags are imported into Datadog. You can use wildcards (? for single character, * for multi-character) to match many hosts, or ! to exclude certain hosts. This example includes all c1* sized instances, but excludes staging hosts:
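The exact filter string from the original example is not reproduced here; a hypothetical filter with that effect (tag names are illustrative) could be:

instance-type:c1*,!env:staging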
Forward logs from your Google Cloud services to Datadog using Google Cloud Dataflow and the Datadog template. This method provides both compression and batching of events before forwarding to Datadog. Follow the instructions in this section to:
1. Create a Pub/Sub topic and pull subscription to receive logs from a configured log sink.
2. Create a custom Dataflow worker service account to provide least privilege to your Dataflow pipeline workers.
3. Create a log sink to publish logs to the Pub/Sub topic.
4. Create a Dataflow job using the Datadog template to stream logs from the Pub/Sub subscription to Datadog.
You have full control over which logs are sent to Datadog through the logging filters you create in the log sink, including GCE and GKE logs. See Google's Logging query language page for information about writing filters.
To collect logs from applications running in GCE or GKE, you can also use the Datadog Agent.
If you have a Google Cloud VPC, the Push subscription cannot access endpoints outside the VPC.
The Push subscription does not provide compression or batching of events, and as such is only suitable for a very low volume of logs.
Documentation for the Push subscription is only maintained for troubleshooting or modifying legacy setups. Use a Pull subscription with the Datadog Dataflow template to forward your Google Cloud logs to Datadog instead.
1. Create a Cloud Pub/Sub topic and subscription
Go to the Cloud Pub/Sub console and create a new topic. Select the option Add a default subscription to simplify the setup.
Note: You can also manually configure a Cloud Pub/Sub subscription with the Pull delivery type. If you manually create your Pub/Sub subscription, leave the Enable dead lettering box unchecked. For more details, see Unsupported Pub/Sub features.
Give that topic an explicit name such as export-logs-to-datadog and click Create.
Create an additional topic and default subscription to handle any log messages rejected by the Datadog API. The name of this topic is used within the Datadog Dataflow template as part of the path configuration for the outputDeadletterTopic template parameter. When you have inspected and corrected any issues in the failed messages, send them back to the original export-logs-to-datadog topic by running a Pub/Sub to Pub/Sub template job.
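As a CLI alternative, a minimal sketch of both topics and their pull subscriptions follows; the -deadletter names are assumptions, and no dead-letter policy is set, matching the note above.

```shell
# Topic and pull subscription for logs exported to Datadog.
gcloud pubsub topics create export-logs-to-datadog
gcloud pubsub subscriptions create export-logs-to-datadog-sub \
  --topic=export-logs-to-datadog

# Topic and subscription for messages rejected by the Datadog API,
# later referenced by the outputDeadletterTopic template parameter.
gcloud pubsub topics create export-logs-to-datadog-deadletter
gcloud pubsub subscriptions create export-logs-to-datadog-deadletter-sub \
  --topic=export-logs-to-datadog-deadletter
```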
Datadog recommends creating a secret in Secret Manager with your valid Datadog API key value, for later use in the Datadog Dataflow template.
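For example, the secret can be created from the command line; the secret name datadog-api-key is an assumption reused later in this guide.

```shell
# Store the Datadog API key in Secret Manager so the Dataflow template can
# read it at runtime instead of receiving it as a plaintext parameter.
printf '%s' '<DATADOG_API_KEY>' | \
  gcloud secrets create datadog-api-key --data-file=-
```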
Warning: Cloud Pub/Subs are subject to Google Cloud quotas and limitations. If your log volume exceeds those limits, Datadog recommends you split your logs over several topics. See the Monitor the Pub/Sub Log Forwarding section for information on setting up monitor notifications if you approach those limits.
2. Create a custom Dataflow worker service account
The default behavior for Dataflow pipeline workers is to use your project's Compute Engine default service account, which grants permissions to all resources in the project. If you are forwarding logs from a Production environment, you should instead create a custom worker service account with only the necessary roles and permissions, and assign this service account to your Dataflow pipeline workers.
Go to the Service Accounts page in the Google Cloud console and select your project.
Click CREATE SERVICE ACCOUNT and give the service account a descriptive name. Click CREATE AND CONTINUE.
Add the roles in the required permissions table and click DONE.
Allow this service account to read and write to the Cloud Storage bucket specified for staging files.
Note: If you don’t create a custom service account for the Dataflow pipeline workers, ensure that the default Compute Engine service account has the required permissions above.
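The required permissions table is not reproduced here. As an illustrative least-privilege sketch only, the worker typically needs to run Dataflow work items, read the input subscription, publish to the deadletter topic, read the API key secret, and read and write the staging bucket; confirm the exact role names against the table before using them.

```shell
# Illustrative only -- verify roles against the required permissions table.
SA="dataflow-datadog-worker@<PROJECT_ID>.iam.gserviceaccount.com"

gcloud iam service-accounts create dataflow-datadog-worker \
  --display-name="Datadog Dataflow worker"

# Run Dataflow work items in the project.
gcloud projects add-iam-policy-binding <PROJECT_ID> \
  --member="serviceAccount:${SA}" --role="roles/dataflow.worker"

# Read from the log subscription and publish failures to the deadletter topic.
gcloud pubsub subscriptions add-iam-policy-binding export-logs-to-datadog-sub \
  --member="serviceAccount:${SA}" --role="roles/pubsub.subscriber"
gcloud pubsub topics add-iam-policy-binding export-logs-to-datadog-deadletter \
  --member="serviceAccount:${SA}" --role="roles/pubsub.publisher"

# Read the API key secret and use the staging bucket.
gcloud secrets add-iam-policy-binding datadog-api-key \
  --member="serviceAccount:${SA}" --role="roles/secretmanager.secretAccessor"
gcloud storage buckets add-iam-policy-binding gs://<STAGING_BUCKET> \
  --member="serviceAccount:${SA}" --role="roles/storage.objectAdmin"
```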
Choose Cloud Pub/Sub as the destination and select the Cloud Pub/Sub topic that was created for that purpose. Note: The Cloud Pub/Sub topic can be located in a different project.
Choose the logs you want to include in the sink with an optional inclusion or exclusion filter. You can filter the logs with a search query, or use the sample function. For example, to include only 10% of the logs with a severity level of ERROR, create an inclusion filter with severity="ERROR" AND sample(insertId, 0.1).
Click Create Sink.
Note: It is possible to create several exports from Google Cloud Logging to the same Cloud Pub/Sub topic with different sinks.
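The sink can also be created with the gcloud CLI. In this sketch the sink name and filter are illustrative; after creating the sink, grant its writer identity permission to publish to the topic.

```shell
# Create a log sink that publishes matching logs to the Pub/Sub topic.
gcloud logging sinks create export-logs-to-datadog-sink \
  pubsub.googleapis.com/projects/<PROJECT_ID>/topics/export-logs-to-datadog \
  --log-filter='severity>=ERROR'

# The command prints a writer identity (a service account). Allow it to
# publish to the topic.
gcloud pubsub topics add-iam-policy-binding export-logs-to-datadog \
  --member="serviceAccount:<WRITER_IDENTITY>" \
  --role="roles/pubsub.publisher"
```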
Give the job a name and select a Dataflow regional endpoint.
Select Pub/Sub to Datadog in the Dataflow template dropdown; the Required parameters section appears.
a. Select the input subscription in the Pub/Sub input subscription dropdown.
b. Enter the following in the Datadog Logs API URL field:
https://
Note: Ensure that the Datadog site selector on the right of the page is set to your Datadog site before copying the URL above.
c. Select the topic created to receive message failures in the Output deadletter Pub/Sub topic dropdown.
d. Specify a path for temporary files in your storage bucket in the Temporary location field.
If you created a secret in Secret Manager with your Datadog API key value as mentioned in step 1, enter the resource name of the secret in the Google Cloud Secret Manager ID field.
See Template parameters in the Dataflow template for details on using the other available options:
apiKeySource=KMS with apiKeyKMSEncryptionKey set to your Cloud KMS key ID and apiKey set to the encrypted API key
Not recommended: apiKeySource=PLAINTEXT with apiKey set to the plaintext API key
If you created a custom worker service account, select it in the Service account email dropdown.
Click RUN JOB.
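The same job can be launched from the command line. This is a sketch under stated assumptions: the template path, the SECRET_MANAGER value for apiKeySource, and the apiKeySecretId parameter name are inferred from the options described above and should be confirmed against the Template parameters documentation, and <DATADOG_LOGS_INTAKE_URL> must be replaced with the logs API URL for your Datadog site.

```shell
# Launch the Pub/Sub to Datadog Dataflow template (parameter names assumed).
gcloud dataflow jobs run export-logs-to-datadog \
  --region=<REGION> \
  --gcs-location=gs://dataflow-templates/latest/Cloud_PubSub_to_Datadog \
  --service-account-email=<WORKER_SA_EMAIL> \
  --staging-location=gs://<STAGING_BUCKET>/temp \
  --parameters=\
inputSubscription=projects/<PROJECT_ID>/subscriptions/export-logs-to-datadog-sub,\
url=https://<DATADOG_LOGS_INTAKE_URL>,\
outputDeadletterTopic=projects/<PROJECT_ID>/topics/export-logs-to-datadog-deadletter,\
apiKeySource=SECRET_MANAGER,\
apiKeySecretId=projects/<PROJECT_ID>/secrets/datadog-api-key/versions/latest
```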
Validation
New logging events delivered to the Cloud Pub/Sub topic appear in the Datadog Log Explorer.
Cloud Pub/Subs are subject to Google Cloud quotas and limitations. If your log volume exceeds those limits, Datadog recommends you split your logs over several topics, using different filters.
To be automatically notified when you reach this quota, activate the Pub/Sub metric integration and set up a monitor on the metric gcp.pubsub.subscription.num_outstanding_messages. Filter this monitor on the subscription that exports logs to Datadog to make sure it never goes above 1000, as in the example below:
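For example, a metric monitor query along these lines (the subscription tag value is a placeholder) alerts before the backlog reaches that threshold:

max(last_5m):max:gcp.pubsub.subscription.num_outstanding_messages{subscription_id:export-logs-to-datadog-sub} > 1000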
Monitor the Dataflow pipeline
Use Datadog's Google Cloud Dataflow integration to monitor all aspects of your Dataflow pipelines. You can see all your key Dataflow metrics on the out-of-the-box dashboard, enriched with contextual data such as information about the GCE instances running your Dataflow workloads, and your Pub/Sub throughput.
See the individual Google Cloud integration pages for metrics.
Events
All service events generated by your Google Cloud Platform are forwarded to your Datadog Events Explorer.
Service Checks
The Google Cloud Platform integration does not include any service checks.
Tags
Tags are automatically assigned based on a variety of Google Cloud Platform and Google Compute Engine configuration options. The project_id tag is added to all metrics. Additional tags are collected from Google Cloud Platform when available, and vary based on the metric type.
Additionally, Datadog collects the following as tags:
Any hosts with <key>:<value> labels.
Custom labels from Google Pub/Sub, GCE, Cloud SQL, and Cloud Storage.
Troubleshooting
Incorrect metadata for user-defined gcp.logging metrics?
For non-standard gcp.logging metrics, such as metrics beyond Datadog's out-of-the-box logging metrics, the metadata applied may not be consistent with Google Cloud Logging. In these cases, manually set the metadata by navigating to the metric summary page, searching for and selecting the metric in question, and clicking the pencil icon next to the metadata.