---
title: Google Cloud Platform
description: >-
  Google Cloud Platform is a collection of web services that together make up a
  cloud computing platform.
breadcrumbs: Docs > Integrations > Google Cloud Platform
---

# Google Cloud Platform

## Overview{% #overview %}

Use this guide to get started monitoring your Google Cloud environment. The recommended setup simplifies configuration for Google Cloud environments with multiple projects, allowing you to maximize your monitoring coverage.

{% collapsible-section %}
#### See the full list of Google Cloud integrations

{% alert level="warning" %}
Datadog's Google Cloud integration collects [all Google Cloud metrics](https://cloud.google.com/monitoring/api/metrics_gcp). Datadog continually updates the docs to show every dependent integration, but the list sometimes lags behind the latest Google Cloud services and metrics.
If you don't see an integration for a specific Google Cloud service, reach out to [Datadog Support](https://www.datadoghq.com/support/).
{% /alert %}

| Integration                                                                                               | Description                                                                           |
| --------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------- |
| [AI Platform](https://docs.datadoghq.com/integrations/google-cloud-ml/)                                   | Machine learning services                                                             |
| AlloyDB for PostgreSQL                                                                                    | Fully managed PostgreSQL-compatible database service                                  |
| API Gateway                                                                                               | Fully managed API gateway for serverless backends                                     |
| Apigee                                                                                                    | API management platform for designing, securing, and scaling APIs                     |
| [App Engine](https://docs.datadoghq.com/integrations/google-app-engine/)                                  | PaaS (platform as a service) to build scalable applications                           |
| App Hub                                                                                                   | Manage and monitor application workloads across Google Cloud                          |
| Application Integration                                                                                   | Managed integration platform for Google Cloud services                                |
| Artifact Registry                                                                                         | Universal repository for container images and build artifacts                         |
| Backup and DR Service                                                                                     | Managed backup and disaster recovery service                                          |
| Backup for GKE                                                                                            | Backup and restore service for GKE clusters and workloads                             |
| Bare Metal Solution                                                                                       | Run bare metal workloads on Google Cloud infrastructure                               |
| [BigQuery](https://docs.datadoghq.com/integrations/google-cloud-bigquery/)                                | Enterprise data warehouse                                                             |
| BigQuery BI Engine                                                                                        | In-memory analysis service for fast BigQuery queries                                  |
| BigQuery Data Transfer Service                                                                            | Automated data ingestion from Google and external sources into BigQuery               |
| BigQuery Engine for Apache Flink                                                                          | Managed Apache Flink service for stream processing on Google Cloud                    |
| BigQuery Storage API                                                                                      | Fast and efficient read access to BigQuery managed storage                            |
| [Bigtable](https://docs.datadoghq.com/integrations/google-cloud-bigtable/)                                | NoSQL Big Data database service                                                       |
| Blockchain Node Engine                                                                                    | Fully managed node-hosting service for blockchain networks                            |
| Certificate Authority Service                                                                             | Managed private certificate authority service                                         |
| Certificate Manager                                                                                       | Manage and deploy SSL/TLS certificates at scale                                       |
| Cloud Billing                                                                                             | Manage and monitor billing budgets for Google Cloud projects                          |
| [Cloud Composer](https://docs.datadoghq.com/integrations/google-cloud-composer/)                          | A fully managed workflow orchestration service                                        |
| Cloud Data Loss Prevention                                                                                | Discover, classify, and protect sensitive data                                        |
| Cloud Deploy                                                                                              | Managed continuous delivery to Google Cloud targets                                   |
| Cloud DNS                                                                                                 | Scalable, reliable, managed domain name system service                                |
| Cloud Healthcare API                                                                                      | Store, process, and access healthcare data in Google Cloud                            |
| Cloud IDS                                                                                                 | Cloud-native managed intrusion detection system                                       |
| [Cloud Interconnect](https://docs.datadoghq.com/integrations/google-cloud-interconnect/)                  | Hybrid connectivity                                                                   |
| Cloud Key Management Service                                                                              | Manage cryptographic keys for Google Cloud services                                   |
| [Cloud Load Balancing](https://docs.datadoghq.com/integrations/google-cloud-loadbalancing/)               | Distribute load-balanced compute resources                                            |
| [Cloud Logging](https://docs.datadoghq.com/integrations/google-stackdriver-logging/)                      | Real-time log management and analysis                                                 |
| Cloud Monitoring                                                                                          | Full-stack monitoring and observability for Google Cloud                              |
| [Cloud Router](https://docs.datadoghq.com/integrations/google-cloud-router/)                              | Exchange routes between your VPC and on-premises networks by using BGP                |
| [Cloud Run](https://docs.datadoghq.com/integrations/google-cloud-run/)                                    | Managed compute platform that runs stateless containers over HTTP                     |
| [Cloud Run functions](https://docs.datadoghq.com/integrations/google-cloud-functions/)                    | Serverless platform for building event-based microservices                            |
| Cloud Service Mesh                                                                                        | Managed service mesh for microservices traffic management                             |
| [Cloud SQL](https://docs.datadoghq.com/integrations/google-cloudsql/)                                     | Fully managed relational database service for MySQL, PostgreSQL, and SQL Server       |
| Cloud SQL for MySQL                                                                                       | Query performance diagnostics for Cloud SQL databases                                 |
| [Cloud Storage](https://docs.datadoghq.com/integrations/google-cloud-storage/)                            | Unified object storage                                                                |
| Cloud Storage for Firebase                                                                                | Object storage for Firebase applications                                              |
| [Cloud Tasks](https://docs.datadoghq.com/integrations/google-cloud-tasks/)                                | Distributed task queues                                                               |
| [Cloud TPU](https://docs.datadoghq.com/integrations/google-cloud-tpu/)                                    | Train and run machine learning models                                                 |
| Cloud Trace                                                                                               | Distributed tracing to diagnose application latency                                   |
| [Cloud VPN](https://docs.datadoghq.com/integrations/google-cloud-vpn/)                                    | Managed network functionality                                                         |
| [Compute Engine](https://docs.datadoghq.com/integrations/google-compute-engine/)                          | High performance virtual machines                                                     |
| Compute Engine Autoscaler                                                                                 | Automatically scale managed instance groups based on load                             |
| Contact Center AI Insights                                                                                | Analyze customer interactions to improve contact center performance                   |
| Database Migration                                                                                        | Migrate databases to Cloud SQL, AlloyDB, and Spanner                                  |
| [Dataflow](https://docs.datadoghq.com/integrations/google-cloud-dataflow/)                                | A fully managed service for transforming and enriching data in stream and batch modes |
| Dataplex                                                                                                  | Intelligent data catalog for analytics governance                                     |
| [Dataproc](https://docs.datadoghq.com/integrations/google-cloud-dataproc/)                                | A cloud service for running Apache Spark and Apache Hadoop clusters                   |
| Dataproc Metastore                                                                                        | Fully managed Apache Hive metastore service                                           |
| [Datastore](https://docs.datadoghq.com/integrations/google-cloud-datastore/)                              | NoSQL database                                                                        |
| Datastream                                                                                                | Serverless change data capture and replication service                                |
| Dialogflow                                                                                                | Natural language understanding platform for conversational AI                         |
| Display & Video 360 API                                                                                   | Programmatic advertising and media buying platform                                    |
| Earth Engine                                                                                              | Planetary-scale platform for geospatial and environmental analysis                    |
| [Filestore](https://docs.datadoghq.com/integrations/google-cloud-filestore/)                              | High-performance, fully managed file storage                                          |
| [Firebase](https://docs.datadoghq.com/integrations/google-cloud-firebase/)                                | Mobile platform for application development                                           |
| [Firestore](https://docs.datadoghq.com/integrations/google-cloud-firestore/)                              | A flexible, scalable database for mobile, web, and server development                 |
| Firewall Insights                                                                                         | Recommendations and analysis to optimize VPC firewall rules                           |
| Fleet Engine                                                                                              | Real-time fleet management and routing service                                        |
| Google Assistant Smart Home                                                                               | Monitor smart home device interactions powered by Google Assistant                    |
| [Google Cloud API](https://docs.datadoghq.com/integrations/google-cloud-apis/)                            | Programmatic interfaces for all Google Cloud Platform services                        |
| [Google Cloud Armor](https://docs.datadoghq.com/integrations/google-cloud-armor/)                         | Network security service to help protect against denial-of-service and web attacks    |
| Google Cloud Managed Service for Apache Kafka                                                             | Managed Apache Kafka service for event streaming                                      |
| Google Distributed Cloud                                                                                  | Run Google Cloud services at the edge or on-premises                                  |
| [Google Kubernetes Engine](https://docs.datadoghq.com/integrations/google-kubernetes-engine/)             | Cluster manager and orchestration system                                              |
| [Google Kubernetes Engine (deprecated)](https://docs.datadoghq.com/integrations/google-container-engine/) | Kubernetes, managed by Google                                                         |
| Google Maps Platform                                                                                      | Maps, routes, and places APIs for location-aware applications                         |
| Google Security Operations                                                                                | Cloud-native security operations platform for threat detection and response           |
| Identity and Access Management                                                                            | Fine-grained access control for Google Cloud resources                                |
| Identity Platform                                                                                         | Customer identity and access management as a service                                  |
| Integration Connectors                                                                                    | Pre-built connectors for Google Cloud and third-party services                        |
| Live Stream API                                                                                           | Ingest and stream live and on-demand video                                            |
| Managed Service for Microsoft Active Directory                                                            | Run Microsoft Active Directory on Google Cloud infrastructure                         |
| Media CDN                                                                                                 | Global content delivery network optimized for streaming media                         |
| Memorystore                                                                                               | Managed in-memory data store service                                                  |
| Memorystore for Memcached                                                                                 | Managed Memcached for distributed in-memory caching                                   |
| [Memorystore for Redis](https://docs.datadoghq.com/integrations/google-cloud-redis/)                      | A fully managed in-memory data store service                                          |
| NetApp                                                                                                    | Enterprise file storage powered by NetApp ONTAP                                       |
| Network Connectivity                                                                                      | Connect Google Cloud and on-premises resources using multiple connectivity options    |
| Network Topology                                                                                          | Visualize and analyze Google Cloud network topology                                   |
| Oracle Database @ Google Cloud                                                                            | Run Oracle Database workloads in Google Cloud data centers                            |
| Parallelstore                                                                                             | High-performance parallel file storage for HPC and AI/ML workloads                    |
| Google Cloud Managed Service for Prometheus                                                               | Open-source monitoring and alerting for Google Cloud workloads                        |
| [Pub/Sub](https://docs.datadoghq.com/integrations/google-cloud-pubsub/)                                   | Real-time messaging service                                                           |
| Pub/Sub Lite                                                                                              | Low-cost, high-volume messaging for predictable workloads                             |
| reCAPTCHA                                                                                                 | Protect websites and apps from fraud and abuse                                        |
| Recommendations                                                                                           | Personalized product recommendations using machine learning                           |
| Retail API                                                                                                | AI-powered search and recommendations for retail applications                         |
| Secure Web Proxy                                                                                          | Cloud-native managed proxy for securing web egress traffic                            |
| Serverless VPC Access                                                                                     | Connect serverless services to VPC networks                                           |
| [Spanner](https://docs.datadoghq.com/integrations/google-cloud-spanner/)                                  | Horizontally scalable, globally consistent, relational database service               |
| Storage Transfer Service                                                                                  | Transfer data to and from Cloud Storage                                               |
| Telecom Network Automation                                                                                | Automate lifecycle management of telecom network functions                            |
| Transfer Appliance                                                                                        | Hardware appliance for transferring large datasets to Google Cloud                    |
| Translation Hub                                                                                           | Enterprise document translation service                                               |
| [Vertex AI](https://docs.datadoghq.com/integrations/google-cloud-vertex-ai/)                              | Build, train, and deploy custom machine learning (ML) models                          |
| Video Stitcher API                                                                                        | Dynamic ad insertion for video on demand and live streams                             |
| Vision AI                                                                                                 | AI-powered insights from images and video                                             |
| VM Manager                                                                                                | Manage and maintain operating systems on Compute Engine VMs                           |
| Workflows                                                                                                 | Orchestrate and automate Google Cloud and HTTP API services                           |
| Workload                                                                                                  | Monitor workload-level metrics in Google Cloud                                        |

{% /collapsible-section %}

## Setup{% #setup %}

Set up Datadog's Google Cloud integration to collect metrics and logs from your Google Cloud services.

### Prerequisites{% #prerequisites %}

{% callout %}
# Important note for users on the following Datadog sites: app.datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, app.datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com

1. If your organization restricts identities by domain, you must add Datadog's customer identity as an allowed value in your policy. Datadog's customer identity: `C0147pk0i`
{% /callout %}

{% callout %}
# Important note for users on the following Datadog sites: app.ddog-gov.com

1. If your organization restricts identities by domain, you must add Datadog's customer identity as an allowed value in your policy. Datadog's customer identity: `C03lf3ewa`
{% /callout %}

2. Enable the following APIs for **every project** you want to monitor, including the project where the service account has been created.

{% alert level="warning" %}
Service account impersonation and automatic project discovery rely on specific roles and APIs being enabled in each monitored project. Complete this step to avoid integration issues.
{% /alert %}

{% dl %}

{% dt %}
[Cloud Monitoring API](https://console.cloud.google.com/apis/library/monitoring.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to query your Google Cloud metric data.
{% /dd %}

{% dt %}
[Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to discover compute instance data.
{% /dd %}

{% dt %}
[Cloud Asset API](https://console.cloud.google.com/apis/library/cloudasset.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to request Google Cloud resources and link relevant labels to metrics as tags.
{% /dd %}

{% dt %}
[Cloud Resource Manager API](https://console.cloud.google.com/apis/library/cloudresourcemanager.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to append metrics with the correct resources and tags.
{% /dd %}

{% dt %}
[IAM API](https://console.cloud.google.com/apis/library/iam.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to authenticate with Google Cloud.
{% /dd %}

{% dt %}
[Cloud Billing API](https://console.cloud.google.com/apis/library/cloudbilling.googleapis.com)
{% /dt %}

{% dd %}
Allows developers to manage billing for their Google Cloud Platform projects programmatically. See the [Cloud Cost Management (CCM)](https://docs.datadoghq.com/cloud_cost_management/setup/google_cloud/) documentation for more information.
{% /dd %}

{% /dl %}
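If you prefer the command line, these APIs can also be enabled with the gcloud CLI. This is a sketch, not a required step; `PROJECT_ID` is a placeholder for each project you want to monitor:

```shell
# Enable the APIs required by the Datadog integration in one project.
# Repeat for every project you want to monitor.
gcloud services enable \
  monitoring.googleapis.com \
  compute.googleapis.com \
  cloudasset.googleapis.com \
  cloudresourcemanager.googleapis.com \
  iam.googleapis.com \
  --project "PROJECT_ID"
```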

3. Ensure that any projects being monitored are not configured as [scoping projects](https://cloud.google.com/monitoring/settings#:~:text=A%20scoping%20project%20hosts%20a,is%20also%20a%20scoping%20project.) that pull in metrics from multiple other projects.

### Metric collection{% #metric-collection %}

#### Installation{% #installation %}

{% tab title="Org- and Folder-level project discovery" %}
Organization-level (or folder-level) monitoring is recommended for comprehensive coverage of all projects, including any future projects that may be created in an org or folder.

**Note**: Your [Google Cloud Identity](https://cloud.google.com/identity/docs/overview) user account must have the `Admin` role assigned to it at the desired scope to complete the setup in Google Cloud (for example, `Organization Admin`).

{% collapsible-section %}
##### 1. Create a Google Cloud service account in the default project

1. Open your [Google Cloud console](https://console.cloud.google.com/).
1. Navigate to **IAM & Admin** > **Service Accounts**.
1. Click **Create service account** at the top.
1. Give the service account a unique name.
1. Click **Done** to complete creating the service account.
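The same service account can also be created with the gcloud CLI. This is a sketch under assumed names; `datadog-integration` and `PROJECT_ID` are placeholders:

```shell
# Create a service account for the Datadog integration in the default project.
gcloud iam service-accounts create datadog-integration \
  --display-name "Datadog integration" \
  --project "PROJECT_ID"
```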

{% /collapsible-section %}

{% collapsible-section %}
##### 2. Add the service account at the organization or folder level

1. In the Google Cloud console, go to the **IAM** page.
1. Select a folder or organization.
1. To grant a role to a principal that does not already have other roles on the resource, click **Grant Access**.
1. Enter the email address of the service account you created earlier.
1. Assign the following roles:
   - [Compute Viewer](https://cloud.google.com/compute/docs/access/iam#compute.viewer) provides **read-only** access to get and list Compute Engine resources
   - [Monitoring Viewer](https://cloud.google.com/monitoring/access-control#monitoring_roles) provides **read-only** access to the monitoring data available in your Google Cloud environment
   - [Cloud Asset Viewer](https://cloud.google.com/iam/docs/understanding-roles#cloudasset.viewer) provides **read-only** access to cloud asset metadata
   - [Browser](https://cloud.google.com/resource-manager/docs/access-control-proj#browser) provides **read-only** access to browse the hierarchy of a project
   - [Service Usage Consumer](https://cloud.google.com/service-usage/docs/access-control#serviceusage.serviceUsageConsumer) (**optional**, for multi-project environments) provides per-project cost and API quota attribution after this feature has been enabled by Datadog support
1. Click **Save**.

**Note**: The `Browser` role is only required in the default project of the service account. Other projects require only the other listed roles.
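As a command-line alternative, the roles above can be granted at the organization level with gcloud. This is a sketch; `ORG_ID` and `SA_EMAIL` are placeholders for your organization ID and the service account's email:

```shell
# Grant the read-only roles to the service account at the organization level.
for role in roles/compute.viewer roles/monitoring.viewer roles/cloudasset.viewer roles/browser; do
  gcloud organizations add-iam-policy-binding "ORG_ID" \
    --member "serviceAccount:SA_EMAIL" \
    --role "$role"
done
```

For a folder scope, the equivalent command is `gcloud resource-manager folders add-iam-policy-binding` with a folder ID.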
{% /collapsible-section %}

{% collapsible-section %}
##### 3. Add the Datadog principal to your service account

**Note**: If you previously configured access using a shared Datadog principal, you can revoke the permission for that principal after you complete these steps.

1. In Datadog, navigate to **Integrations** > [**Google Cloud Platform**](https://app.datadoghq.com/integrations/google-cloud-platform).
1. Click **Add Google Cloud Account**. If you have no configured projects, you are automatically redirected to this page.
1. Copy your Datadog principal and keep it for the next section.

{% image
   source="https://docs.dd-static.net/images/integrations/google_cloud_platform/principal-2.de65e967af6ef672299441eb286356ce.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/principal-2.de65e967af6ef672299441eb286356ce.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The page for adding a new Google Cloud account in Datadog's Google Cloud integration tile" /%}

**Note**: Keep this window open for Section 4.
1. In the [Google Cloud console](https://console.cloud.google.com/), under the **Service Accounts** menu, find the service account you created in Section 1.
1. Go to the **Permissions** tab and click **Grant Access**.
{% image
   source="https://docs.dd-static.net/images/integrations/google_cloud_platform/grant-access.4ac9c4f78e350411355e5fdebe07dd27.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/grant-access.4ac9c4f78e350411355e5fdebe07dd27.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="Google Cloud console interface, showing the Permissions tab under Service Accounts." /%}
1. Paste your Datadog principal into the **New principals** text box.
1. Assign the role of **Service Account Token Creator**.
1. Click **Save**.
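The **Service Account Token Creator** grant can also be applied with gcloud. This is a sketch with placeholders: `SA_EMAIL` is your service account's email, and `DATADOG_PRINCIPAL` is the principal you copied from Datadog:

```shell
# Allow the Datadog principal to generate access tokens for the service account.
gcloud iam service-accounts add-iam-policy-binding "SA_EMAIL" \
  --member "serviceAccount:DATADOG_PRINCIPAL" \
  --role "roles/iam.serviceAccountTokenCreator"
```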
{% /collapsible-section %}

{% collapsible-section %}
##### 4. Complete the integration setup in Datadog

1. In your Google Cloud console, navigate to the **Service Account** > **Details** tab. On this page, find the email associated with this Google service account. It has the format `<SA_NAME>@<PROJECT_ID>.iam.gserviceaccount.com`.
1. Copy this email.
1. Return to the integration configuration tile in Datadog (where you copied your Datadog principal in the previous section).
1. Paste the email you copied in **Add Service Account Email**.
1. Click **Verify and Save Account**.

{% /collapsible-section %}

Metrics appear in Datadog approximately **15 minutes** after setup.

#### Best practices for monitoring multiple projects{% #best-practices-for-monitoring-multiple-projects %}

##### Enable per-project cost and API quota attribution{% #enable-per-project-cost-and-api-quota-attribution %}

By default, Google Cloud attributes the cost of monitoring API calls, as well as API quota usage, to the project containing the service account for this integration. As a best practice for Google Cloud environments with multiple projects, enable per-project cost attribution of monitoring API calls and API quota usage. With this enabled, costs and quota usage are attributed to the project being *queried*, rather than the project containing the service account. This provides visibility into the monitoring costs incurred by each project, and also helps to prevent reaching API rate limits.

To enable this feature:

1. Ensure that the Datadog service account has the [Service Usage Consumer](https://cloud.google.com/service-usage/docs/access-control#serviceusage.serviceUsageConsumer) role at the desired scope (folder or organization).
1. Click the **Enable Per Project Quota** toggle in the **Projects** tab of the [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform).
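The role grant in step 1 can also be performed with gcloud. This is a sketch with placeholders (`ORG_ID`, `SA_EMAIL`):

```shell
# Grant Service Usage Consumer at the organization level so API costs and
# quota usage are attributed to the project being queried.
gcloud organizations add-iam-policy-binding "ORG_ID" \
  --member "serviceAccount:SA_EMAIL" \
  --role "roles/serviceusage.serviceUsageConsumer"
```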

{% /tab %}

{% tab title="Project-level metric collection" %}
You can use [service account impersonation](https://cloud.google.com/iam/docs/service-account-impersonation) and automatic project discovery to integrate Datadog with [Google Cloud](https://docs.datadoghq.com/integrations/google-cloud-platform/). **Note:** Projects with a prefix of `sys-` will not be picked up as part of project discovery.

This method enables you to monitor all projects visible to a service account by assigning IAM roles in the relevant projects. You can assign these roles to projects individually, or you can configure Datadog to monitor groups of projects by assigning these roles at the organization or folder level. Assigning roles in this way allows Datadog to automatically discover and monitor all projects in the given scope, including any new projects that may be added to the group in the future.

{% collapsible-section #create-service-account %}
##### 1. Create a Google Cloud service account

1. Open your [Google Cloud console](https://console.cloud.google.com/).
1. Navigate to **IAM & Admin** > **Service Accounts**.
1. Click on **Create service account** at the top.
1. Give the service account a unique name, then click **Create and continue**.
1. Add the following roles to the service account:
   - [Monitoring Viewer](https://cloud.google.com/monitoring/access-control#monitoring_roles) provides **read-only** access to the monitoring data available in your Google Cloud environment
   - [Compute Viewer](https://cloud.google.com/compute/docs/access/iam#compute.viewer) provides **read-only** access to get and list Compute Engine resources
   - [Cloud Asset Viewer](https://cloud.google.com/iam/docs/understanding-roles#cloudasset.viewer) provides **read-only** access to cloud assets metadata
   - [Browser](https://cloud.google.com/resource-manager/docs/access-control-proj#browser) provides **read-only** access to discover accessible projects
1. Click **Continue**, then **Done** to complete creating the service account.

{% image
   source="https://docs.dd-static.net/images/integrations/google_cloud_platform/create-service-account.ec27127399862bac96b88a0a3828b4b8.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/create-service-account.ec27127399862bac96b88a0a3828b4b8.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="Google Cloud console interface, showing the 'Create service account' flow. Under 'Grant this service account access to project', the four roles in the instructions are added." /%}
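The same service account can also be created from Cloud Shell or the gcloud CLI. This is a sketch that grants the four roles above at the project level; the account name and project ID are placeholders:

```shell
# Create the service account (name and project are placeholders)
gcloud iam service-accounts create datadog-integration \
  --display-name="Datadog integration" \
  --project=<PROJECT_ID>

# Grant the four read-only roles listed in the instructions above
for role in roles/monitoring.viewer roles/compute.viewer \
            roles/cloudasset.viewer roles/browser; do
  gcloud projects add-iam-policy-binding <PROJECT_ID> \
    --member="serviceAccount:datadog-integration@<PROJECT_ID>.iam.gserviceaccount.com" \
    --role="$role"
done
```

To monitor a group of projects, grant the same roles at the folder or organization level instead, as described above.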

{% /collapsible-section %}

{% collapsible-section #add-principal-to-service-account %}
##### 2. Add the Datadog principal to your service account

1. In Datadog, navigate to [**Integrations** > **Google Cloud Platform**](https://app.datadoghq.com/integrations/google-cloud-platform).

1. Click on **Add GCP Account**. If you have no configured projects, you are automatically redirected to this page.

1. If you have not generated a Datadog principal for your org, click the **Generate Principal** button.

1. Copy your Datadog principal and keep it for the next section.

   {% image
      source="https://docs.dd-static.net/images/integrations/google_cloud_platform/principal-2.de65e967af6ef672299441eb286356ce.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/principal-2.de65e967af6ef672299441eb286356ce.png?auto=format&fit=max&w=850&dpr=2 2x"
      alt="Datadog interface, showing the 'Add New GCP Account' flow. The first step, 'Add Datadog Principal to Google,' features a text box where a user can generate a Datadog Principal and copy it to their clipboard. The second step, 'Add Service Account Email,' features a text box that the user can complete in section 3." /%}

   **Note:** Keep this window open for the next section.

1. In [Google Cloud console](https://console.cloud.google.com/), under the **Service Accounts** menu, find the service account you created in the first section.

1. Go to the **Permissions** tab and click on **Grant Access**.

   {% image
      source="https://docs.dd-static.net/images/integrations/google_cloud_platform/grant-access.4ac9c4f78e350411355e5fdebe07dd27.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/grant-access.4ac9c4f78e350411355e5fdebe07dd27.png?auto=format&fit=max&w=850&dpr=2 2x"
      alt="Google Cloud console interface, showing the Permissions tab under Service Accounts." /%}

1. Paste your Datadog principal into the **New principals** text box.

1. Assign the role of **Service Account Token Creator** and click **SAVE**.

   {% image
      source="https://docs.dd-static.net/images/integrations/google_cloud_platform/add-principals-blurred.942b38b5d0090b048093fe2047400586.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/add-principals-blurred.942b38b5d0090b048093fe2047400586.png?auto=format&fit=max&w=850&dpr=2 2x"
      alt="Google Cloud console interface, showing an 'Add principals' box and an 'Assign roles' interface." /%}

**Note**: If you previously configured access using a shared Datadog principal, you can revoke the permission for that principal after you complete these steps.
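The grant in the steps above can also be made from the CLI. In this sketch, the service account email and the Datadog principal are placeholders for the values from the previous steps:

```shell
# Allow the Datadog principal to impersonate the service account
gcloud iam service-accounts add-iam-policy-binding \
  <SA_NAME>@<PROJECT_ID>.iam.gserviceaccount.com \
  --member="serviceAccount:<DATADOG_PRINCIPAL>" \
  --role="roles/iam.serviceAccountTokenCreator"
```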
{% /collapsible-section %}

{% collapsible-section %}
##### 3. Complete the integration setup in Datadog

1. In your Google Cloud console, navigate to the **Service Account** > **Details** tab. There, you can find the email associated with this Google service account. It resembles `<sa-name>@<project-id>.iam.gserviceaccount.com`.
1. Copy this email.
1. Return to the integration configuration tile in Datadog (where you copied your Datadog principal in the previous section).
1. In the box under **Add Service Account Email**, paste the email you previously copied.
1. Click on **Verify and Save Account**.

Metrics appear in Datadog within approximately fifteen minutes.
{% /collapsible-section %}

{% /tab %}

#### Validation{% #validation %}

To view your metrics, use the left menu to navigate to **Metrics** > **Summary** and search for `gcp`:

{% image
   source="https://docs.dd-static.net/images/integrations/google_cloud_platform/gcp_metric_summary.af08dc9779730588601ddec68f47eaf5.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/gcp_metric_summary.af08dc9779730588601ddec68f47eaf5.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The Metric Summary page in Datadog filtered to metrics beginning with GCP" /%}

#### Configuration{% #configuration %}

{% collapsible-section %}
##### Limit metric collection by metric namespace and granular metric filters

Optionally, you can choose which Google Cloud services you monitor with Datadog, and further define which specific metrics you want to collect from each individual service. Configuring metric collection lets you optimize your Google Cloud Monitoring API costs, while retaining visibility into your critical services.

For granular control of metric collection from a service, define inclusion and exclusion filters for that service. A metric is collected only if it matches **at least one** inclusion filter, and does not match **any** exclusion filters.

Under the **Metric Collection** tab in Datadog's [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), unselect the metric namespaces to exclude. To apply granular filtering for enabled services, click on the service in question and apply your filters in the `Add filters for gcp.<service>` field.

{% image
   source="https://docs.dd-static.net/images/integrations/google_cloud_platform/limit_metric_collection_2025-11-11.383906416aa505b6f53d8e9620b1f7ba.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/limit_metric_collection_2025-11-11.383906416aa505b6f53d8e9620b1f7ba.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The metric collection tab in the Datadog Google Cloud integration page, with the AI Platform service expanded to display the Add filters for gcp.ml field" /%}

**Example filters**:

{% dl %}

{% dt %}
`subscription.*` `topic.*`
{% /dt %}

{% dd %}
Limit collection to metrics **matching either** `gcp.<service>.subscription.*` **or** `gcp.<service>.topic.*`
{% /dd %}

{% dt %}
`!*_cost` `!*_count`
{% /dt %}

{% dd %}
Limit collection to metrics **matching neither** `gcp.<service>.*_cost` **nor** `gcp.<service>.*_count`
{% /dd %}

{% dt %}
`snapshot.*` `!*_by_region`
{% /dt %}

{% dd %}
Limit collection to metrics **matching** `gcp.<service>.snapshot.*` **but not matching** `gcp.<service>.*_by_region`
{% /dd %}

{% /dl %}

{% /collapsible-section %}

{% collapsible-section %}
##### Limit metric collection by Google Cloud region and by global resources

Under the **Metric Collection** tab in Datadog's [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), deselect the regions you want to exclude from metric collection.

You can also specify additional locations that are not listed, and disable any global metrics not associated with a region.

{% image
   source="https://docs.dd-static.net/images/integrations/google_cloud_platform/metric_region_filtering.865a0a4dcf05598afdb4888465e6a0b8.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/metric_region_filtering.865a0a4dcf05598afdb4888465e6a0b8.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The metric collection tab in the Datadog Google Cloud integration page, with the Enable Global Metrics option highlighted and a subset of regions selected. The Additional Locations option is also highlighted with a multi-region filter defined" /%}

{% /collapsible-section %}

{% collapsible-section %}
##### Limit metric collection by host or Cloud Run instance

1. Assign a tag (such as `datadog:true`) to the hosts or Cloud Run instances you want to monitor with Datadog.
1. Under the **Metric Collection** tab in Datadog's [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), enter the tags in the **Limit Metric Collection Filters** textbox. Only hosts that match one of the defined tags are imported into Datadog. You can use wildcards (`?` for a single character, `*` for multiple characters) to match many hosts, or `!` to exclude certain hosts. This example includes all `c1*` sized instances, but excludes staging hosts:

```text
datadog:monitored,env:production,!env:staging,instance-type:c1.*
```

See Google's [Organize resources using labels](https://cloud.google.com/compute/docs/labeling-resources) page for more details.
{% /collapsible-section %}

#### Leveraging the Datadog Agent{% #leveraging-the-datadog-agent %}

Use the [Datadog Agent](https://docs.datadoghq.com/agent/) to collect the [most granular, low-latency metrics](https://docs.datadoghq.com/data_security/data_retention_periods/) from your infrastructure. Install the Agent on any host, including [GKE](https://docs.datadoghq.com/integrations/google-kubernetes-engine/), to get deeper insights from the [traces](https://docs.datadoghq.com/tracing/) and [logs](https://docs.datadoghq.com/logs/) it can collect. For more information, see [Why should I install the Datadog Agent on my cloud instances?](https://docs.datadoghq.com/agent/guide/why-should-i-install-the-agent-on-my-cloud-instances/)

## Log collection{% #log-collection %}

See the [Google Cloud Log Forwarding Setup page](https://docs.datadoghq.com/logs/guide/google-cloud-log-forwarding) for log forwarding setup options and instructions.

## Expanded BigQuery monitoring{% #expanded-bigquery-monitoring %}

{% callout %}
##### Join the Preview!

Expanded BigQuery monitoring is in Preview. Use this form to sign up to start gaining insights into your query performance.

[Request Access](https://www.datadoghq.com/product-preview/bigquery-monitoring/)
{% /callout %}

Expanded BigQuery monitoring provides granular visibility into your BigQuery environments.

### BigQuery jobs performance monitoring{% #bigquery-jobs-performance-monitoring %}

To monitor the performance of your BigQuery jobs, grant the [BigQuery Resource Viewer](https://cloud.google.com/bigquery/docs/access-control#bigquery.resourceViewer) role to the Datadog service account for each Google Cloud project.

**Notes**:

- You need to have verified your Google Cloud service account in Datadog, as outlined in the setup section.
- You do **not** need to set up Dataflow to collect logs for expanded BigQuery monitoring.

1. In the Google Cloud console, go to the [IAM page](https://console.cloud.google.com/iam-admin/).
1. Click **Grant access**.
1. Enter the email of your service account in **New principals**.
1. Assign the [BigQuery Resource Viewer](https://cloud.google.com/bigquery/docs/access-control#bigquery.resourceViewer) role.
1. Click **SAVE**.
1. In Datadog's [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), click into the **BigQuery** tab.
1. Click the **Enable Query Performance** toggle.
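Steps 1 through 5 can be condensed into a single CLI command; the project ID and service account email are placeholders for your own values:

```shell
# Grant the Datadog service account the BigQuery Resource Viewer role
gcloud projects add-iam-policy-binding <PROJECT_ID> \
  --member="serviceAccount:<SA_NAME>@<PROJECT_ID>.iam.gserviceaccount.com" \
  --role="roles/bigquery.resourceViewer"
```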

### BigQuery data quality monitoring{% #bigquery-data-quality-monitoring %}

BigQuery data quality monitoring provides quality metrics from your BigQuery tables (such as freshness and updates to row count and size). Explore the data from your tables in depth on the [Data Quality Monitoring page](https://app.datadoghq.com/datasets/tables/explore).

To collect quality metrics, grant the [BigQuery Metadata Viewer](https://cloud.google.com/bigquery/docs/access-control#bigquery.metadataViewer) role to the Datadog Service Account for each BigQuery table you are using.

**Note**: BigQuery Metadata Viewer can be applied at a BigQuery table, dataset, project, or organization level.

- For Data Quality Monitoring of all tables within a dataset, grant access at the dataset level.
- For Data Quality Monitoring of all datasets within a project, grant access at the project level.

1. Navigate to [BigQuery](https://console.cloud.google.com/bigquery).
1. In the Explorer, search for the desired BigQuery resource.
1. Click the three-dot menu next to the resource, then click **Share -> Manage Permissions**.

{% image
   source="https://docs.dd-static.net/images/integrations/google_cloud_platform/bigquery_manage_permissions.3f95b0f1e2829e64cce41bfab70a793a.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/bigquery_manage_permissions.3f95b0f1e2829e64cce41bfab70a793a.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The Manage Permissions menu option of a BigQuery dataset resource" /%}
1. Click **ADD PRINCIPAL**.
1. In the new principals box, enter the Datadog service account set up for the Google Cloud integration.
1. Assign the [BigQuery Metadata Viewer](https://cloud.google.com/bigquery/docs/access-control#bigquery.metadataViewer) role.
1. Click **SAVE**.
1. In Datadog's [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), click into the **BigQuery** tab.
1. Click the **Enable Data Quality** toggle.

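For a project-level grant (covering all datasets in the project), the same role can be assigned from the CLI; the project ID and service account email are placeholders:

```shell
# Grant BigQuery Metadata Viewer across the whole project
gcloud projects add-iam-policy-binding <PROJECT_ID> \
  --member="serviceAccount:<SA_NAME>@<PROJECT_ID>.iam.gserviceaccount.com" \
  --role="roles/bigquery.metadataViewer"
```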
### BigQuery jobs log retention{% #bigquery-jobs-log-retention %}

Datadog recommends setting up a new [logs index](https://app.datadoghq.com/logs/pipelines/indexes) called `data-observability-queries`, and indexing your BigQuery job logs for 15 days. Use the following index filter to pull in the logs:

```text
service:data-observability @platform:*
```

See the [Log Management pricing page](https://www.datadoghq.com/pricing/?product=log-management#products) for cost estimation.

## Resource changes collection{% #resource-changes-collection %}

Select **Enable Resource Collection** in the [Resource Collection tab](https://app.datadoghq.com/integrations/google-cloud-platform?panel=resources) of the Google Cloud integration page. This allows you to receive resource events in Datadog when Google's [Cloud Asset Inventory](https://cloud.google.com/asset-inventory/docs/monitoring-asset-changes) detects changes in your cloud resources.

Then, follow the steps below to forward change events from a Pub/Sub topic to the Datadog [Event Explorer](https://app.datadoghq.com/event/explorer).

{% collapsible-section %}
#### Google Cloud CLI

### Create a Cloud Pub/Sub topic and subscription{% #create-a-cloud-pubsub-topic-and-subscription %}

#### Create a topic{% #create-a-topic %}

1. In the [Google Cloud Pub/Sub topics page](https://console.cloud.google.com/cloudpubsub/topicList), click **CREATE TOPIC**.
1. Give the topic a descriptive name.
1. **Uncheck** the option to add a default subscription.
1. Click **CREATE**.

#### Create a subscription{% #create-a-subscription %}

1. In the [Google Cloud Pub/Sub subscriptions page](https://console.cloud.google.com/cloudpubsub/subscription/), click **CREATE SUBSCRIPTION**.
1. Enter `export-asset-changes-to-datadog` for the subscription name.
1. Select the Cloud Pub/Sub topic previously created.
1. Select **Pull** as the delivery type.
1. Click **CREATE**.
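Both resources can also be created from Cloud Shell or the gcloud CLI; the topic name is a placeholder:

```shell
# Create the topic, then a subscription attached to it
# (pull is the default delivery type)
gcloud pubsub topics create <TOPIC_NAME>
gcloud pubsub subscriptions create export-asset-changes-to-datadog \
  --topic=<TOPIC_NAME>
```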

### Grant access{% #grant-access %}

To read from this Pub/Sub subscription, the Google Cloud service account used by the integration needs the `pubsub.subscriptions.consume` permission on the subscription. The predefined role with the fewest permissions that includes it is **Pub/Sub Subscriber**. Follow the steps below to grant this role:

1. In the [Google Cloud Pub/Sub subscriptions page](https://console.cloud.google.com/cloudpubsub/subscription/), click the `export-asset-changes-to-datadog` subscription.
1. In the **info panel** on the right of the page, click the **Permissions** tab. If you don't see the info panel, click **SHOW INFO PANEL**.
1. Click **ADD PRINCIPAL**.
1. Enter the **service account email** used by the Datadog Google Cloud integration. You can find your service accounts listed on the left of the **Configuration** tab in the [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform) in Datadog.
1. Assign the **Pub/Sub Subscriber** role and click **SAVE**.
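The equivalent CLI command, with the service account email as a placeholder:

```shell
# Grant the Pub/Sub Subscriber role on this subscription only
gcloud pubsub subscriptions add-iam-policy-binding \
  export-asset-changes-to-datadog \
  --member="serviceAccount:<SA_NAME>@<PROJECT_ID>.iam.gserviceaccount.com" \
  --role="roles/pubsub.subscriber"
```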

### Create an asset feed{% #create-an-asset-feed %}

Run the command below in [Cloud Shell](https://cloud.google.com/shell) or the [gcloud CLI](https://cloud.google.com/sdk/gcloud) to create a Cloud Asset Inventory Feed that sends change events to the Pub/Sub topic created above.

{% tab title="Project" %}

```bash
gcloud asset feeds create <FEED_NAME> \
--project=<PROJECT_ID> \
--pubsub-topic=projects/<PROJECT_ID>/topics/<TOPIC_NAME> \
--asset-names=<ASSET_NAMES> \
--asset-types=<ASSET_TYPES> \
--content-type=<CONTENT_TYPE>
```

Update the placeholder values as indicated:

- `<FEED_NAME>`: A descriptive name for the Cloud Asset Inventory Feed.
- `<PROJECT_ID>`: Your Google Cloud project ID.
- `<TOPIC_NAME>`: The name of the Pub/Sub topic linked with the `export-asset-changes-to-datadog` subscription.
- `<ASSET_NAMES>`: Comma-separated list of resource [full names](https://cloud.google.com/asset-inventory/docs/resource-name-format) to receive change events from. **Optional** if specifying `asset-types`.
- `<ASSET_TYPES>`: Comma-separated list of [asset types](https://cloud.google.com/asset-inventory/docs/supported-asset-types) to receive change events from. **Optional** if specifying `asset-names`.
- `<CONTENT_TYPE>`: **Optional** asset [content type](https://cloud.google.com/asset-inventory/docs/overview#content_types) to receive change events from.

{% /tab %}

{% tab title="Folder" %}

```bash
gcloud asset feeds create <FEED_NAME> \
--folder=<FOLDER_ID> \
--pubsub-topic=projects/<PROJECT_ID>/topics/<TOPIC_NAME> \
--asset-names=<ASSET_NAMES> \
--asset-types=<ASSET_TYPES> \
--content-type=<CONTENT_TYPE>
```

Update the placeholder values as indicated:

- `<FEED_NAME>`: A descriptive name for the Cloud Asset Inventory Feed.
- `<FOLDER_ID>`: Your Google Cloud folder ID.
- `<TOPIC_NAME>`: The name of the Pub/Sub topic linked with the `export-asset-changes-to-datadog` subscription.
- `<ASSET_NAMES>`: Comma-separated list of resource [full names](https://cloud.google.com/asset-inventory/docs/resource-name-format) to receive change events from. **Optional** if specifying `asset-types`.
- `<ASSET_TYPES>`: Comma-separated list of [asset types](https://cloud.google.com/asset-inventory/docs/supported-asset-types) to receive change events from. **Optional** if specifying `asset-names`.
- `<CONTENT_TYPE>`: **Optional** asset [content type](https://cloud.google.com/asset-inventory/docs/overview#content_types) to receive change events from.

{% /tab %}

{% tab title="Organization" %}

```bash
gcloud asset feeds create <FEED_NAME> \
--organization=<ORGANIZATION_ID> \
--pubsub-topic=projects/<PROJECT_ID>/topics/<TOPIC_NAME> \
--asset-names=<ASSET_NAMES> \
--asset-types=<ASSET_TYPES> \
--content-type=<CONTENT_TYPE>
```

Update the placeholder values as indicated:

- `<FEED_NAME>`: A descriptive name for the Cloud Asset Inventory Feed.
- `<ORGANIZATION_ID>`: Your Google Cloud organization ID.
- `<TOPIC_NAME>`: The name of the Pub/Sub topic linked with the `export-asset-changes-to-datadog` subscription.
- `<ASSET_NAMES>`: Comma-separated list of resource [full names](https://cloud.google.com/asset-inventory/docs/resource-name-format) to receive change events from. **Optional** if specifying `asset-types`.
- `<ASSET_TYPES>`: Comma-separated list of [asset types](https://cloud.google.com/asset-inventory/docs/supported-asset-types) to receive change events from. **Optional** if specifying `asset-names`.
- `<CONTENT_TYPE>`: **Optional** asset [content type](https://cloud.google.com/asset-inventory/docs/overview#content_types) to receive change events from.

{% /tab %}

{% /collapsible-section %}

{% collapsible-section %}
#### Terraform

### Create an asset feed{% #create-an-asset-feed-1 %}

Copy the following Terraform template and substitute the necessary arguments:

{% tab title="Project" %}

```hcl
locals {
  project_id = "<PROJECT_ID>"
}

resource "google_pubsub_topic" "pubsub_topic" {
  project = local.project_id
  name    = "<TOPIC_NAME>"
}

resource "google_pubsub_subscription" "pubsub_subscription" {
  project = local.project_id
  name    = "export-asset-changes-to-datadog"
  topic   = google_pubsub_topic.pubsub_topic.id
}

resource "google_pubsub_subscription_iam_member" "subscriber" {
  project      = local.project_id
  subscription = google_pubsub_subscription.pubsub_subscription.id
  role         = "roles/pubsub.subscriber"
  member       = "serviceAccount:<SERVICE_ACCOUNT_EMAIL>"
}

resource "google_cloud_asset_project_feed" "project_feed" {
  project      = local.project_id
  feed_id      = "<FEED_NAME>"
  content_type = "<CONTENT_TYPE>" # Optional. Remove if unused.

  asset_names = ["<ASSET_NAMES>"] # Optional if specifying asset_types. Remove if unused.
  asset_types = ["<ASSET_TYPES>"] # Optional if specifying asset_names. Remove if unused.

  feed_output_config {
    pubsub_destination {
      topic = google_pubsub_topic.pubsub_topic.id
    }
  }
}
```

Update the placeholder values as indicated:

- `<PROJECT_ID>`: Your Google Cloud project ID.
- `<TOPIC_NAME>`: The name of the Pub/Sub topic to be linked with the `export-asset-changes-to-datadog` subscription.
- `<SERVICE_ACCOUNT_EMAIL>`: The service account email used by the Datadog Google Cloud integration.
- `<FEED_NAME>`: A descriptive name for the Cloud Asset Inventory Feed.
- `<ASSET_NAMES>`: Comma-separated list of resource [full names](https://cloud.google.com/asset-inventory/docs/resource-name-format) to receive change events from. **Optional** if specifying `asset-types`.
- `<ASSET_TYPES>`: Comma-separated list of [asset types](https://cloud.google.com/asset-inventory/docs/supported-asset-types) to receive change events from. **Optional** if specifying `asset-names`.
- `<CONTENT_TYPE>`: **Optional** asset [content type](https://cloud.google.com/asset-inventory/docs/overview#content_types) to receive change events from.

{% /tab %}

{% tab title="Folder" %}

```hcl
locals {
  project_id = "<PROJECT_ID>"
}

resource "google_pubsub_topic" "pubsub_topic" {
  project = local.project_id
  name    = "<TOPIC_NAME>"
}

resource "google_pubsub_subscription" "pubsub_subscription" {
  project = local.project_id
  name    = "export-asset-changes-to-datadog"
  topic   = google_pubsub_topic.pubsub_topic.id
}

resource "google_pubsub_subscription_iam_member" "subscriber" {
  project      = local.project_id
  subscription = google_pubsub_subscription.pubsub_subscription.id
  role         = "roles/pubsub.subscriber"
  member       = "serviceAccount:<SERVICE_ACCOUNT_EMAIL>"
}

resource "google_cloud_asset_folder_feed" "folder_feed" {
  billing_project = local.project_id
  folder          = "<FOLDER_ID>"
  feed_id         = "<FEED_NAME>"
  content_type    = "<CONTENT_TYPE>" # Optional. Remove if unused.

  asset_names = ["<ASSET_NAMES>"] # Optional if specifying asset_types. Remove if unused.
  asset_types = ["<ASSET_TYPES>"] # Optional if specifying asset_names. Remove if unused.

  feed_output_config {
    pubsub_destination {
      topic = google_pubsub_topic.pubsub_topic.id
    }
  }
}
```

Update the placeholder values as indicated:

- `<PROJECT_ID>`: Your Google Cloud project ID.
- `<FOLDER_ID>`: The ID of the folder this feed should be created in.
- `<TOPIC_NAME>`: The name of the Pub/Sub topic to be linked with the `export-asset-changes-to-datadog` subscription.
- `<SERVICE_ACCOUNT_EMAIL>`: The service account email used by the Datadog Google Cloud integration.
- `<FEED_NAME>`: A descriptive name for the Cloud Asset Inventory Feed.
- `<ASSET_NAMES>`: Comma-separated list of resource [full names](https://cloud.google.com/asset-inventory/docs/resource-name-format) to receive change events from. **Optional** if specifying `asset-types`.
- `<ASSET_TYPES>`: Comma-separated list of [asset types](https://cloud.google.com/asset-inventory/docs/supported-asset-types) to receive change events from. **Optional** if specifying `asset-names`.
- `<CONTENT_TYPE>`: **Optional** asset [content type](https://cloud.google.com/asset-inventory/docs/overview#content_types) to receive change events from.

{% /tab %}

{% tab title="Organization" %}

```hcl
locals {
  project_id = "<PROJECT_ID>"
}

resource "google_pubsub_topic" "pubsub_topic" {
  project = local.project_id
  name    = "<TOPIC_NAME>"
}

resource "google_pubsub_subscription" "pubsub_subscription" {
  project = local.project_id
  name    = "export-asset-changes-to-datadog"
  topic   = google_pubsub_topic.pubsub_topic.id
}

resource "google_pubsub_subscription_iam_member" "subscriber" {
  project      = local.project_id
  subscription = google_pubsub_subscription.pubsub_subscription.id
  role         = "roles/pubsub.subscriber"
  member       = "serviceAccount:<SERVICE_ACCOUNT_EMAIL>"
}

resource "google_cloud_asset_organization_feed" "organization_feed" {
  billing_project = local.project_id
  org_id          = "<ORGANIZATION_ID>"
  feed_id         = "<FEED_NAME>"
  content_type    = "<CONTENT_TYPE>" # Optional. Remove if unused.

  asset_names = ["<ASSET_NAMES>"] # Optional if specifying asset_types. Remove if unused.
  asset_types = ["<ASSET_TYPES>"] # Optional if specifying asset_names. Remove if unused.

  feed_output_config {
    pubsub_destination {
      topic = google_pubsub_topic.pubsub_topic.id
    }
  }
}
```

Update the placeholder values as indicated:

- `<PROJECT_ID>`: Your Google Cloud project ID.
- `<TOPIC_NAME>`: The name of the Pub/Sub topic to be linked with the `export-asset-changes-to-datadog` subscription.
- `<SERVICE_ACCOUNT_EMAIL>`: The service account email used by the Datadog Google Cloud integration.
- `<ORGANIZATION_ID>`: Your Google Cloud organization ID.
- `<FEED_NAME>`: A descriptive name for the Cloud Asset Inventory Feed.
- `<ASSET_NAMES>`: Comma-separated list of resource [full names](https://cloud.google.com/asset-inventory/docs/resource-name-format) to receive change events from. **Optional** if specifying `asset-types`.
- `<ASSET_TYPES>`: Comma-separated list of [asset types](https://cloud.google.com/asset-inventory/docs/supported-asset-types) to receive change events from. **Optional** if specifying `asset-names`.
- `<CONTENT_TYPE>`: **Optional** asset [content type](https://cloud.google.com/asset-inventory/docs/overview#content_types) to receive change events from.

{% /tab %}

{% /collapsible-section %}

Datadog recommends setting the `asset-types` parameter to the regular expression `.*` to collect changes for all resources.

**Note**: You must specify at least one value for either the `asset-names` or `asset-types` parameter.

See the [gcloud asset feeds create](https://cloud.google.com/sdk/gcloud/reference/asset/feeds/create) reference for the full list of configurable parameters.
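As a concrete sketch of the recommendation above, a project-scoped feed that collects changes for all resource types might look like the following (the feed name, project ID, and topic name are placeholders):

```shell
# Create a feed covering all asset types, publishing to the topic
# linked with the export-asset-changes-to-datadog subscription
gcloud asset feeds create datadog-asset-feed \
  --project=<PROJECT_ID> \
  --asset-types=".*" \
  --content-type=resource \
  --pubsub-topic="projects/<PROJECT_ID>/topics/<TOPIC_NAME>"
```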

### Enable resource changes collection{% #enable-resource-changes-collection %}

Click the **Enable Resource Changes Collection** toggle in the [Resource Collection tab](https://app.datadoghq.com/integrations/google-cloud-platform?panel=resources) of the Google Cloud integration page.

{% image
   source="https://docs.dd-static.net/images/integrations/google_cloud_platform/enable_resource_change_collection.b44fdc22a80e6eb2c8d2623c108038dd.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/integrations/google_cloud_platform/enable_resource_change_collection.b44fdc22a80e6eb2c8d2623c108038dd.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The Enable Resource Changes Collection toggle in Datadog's Google Cloud integration tile" /%}

#### Validation{% #validation-1 %}

Find your asset change events in the [Datadog Event Explorer](https://app.datadoghq.com/event/explorer?query=source%3Agoogle_cloud_asset_inventory).

## Private Service Connect{% #private-service-connect %}

{% callout %}
##### Important note for users on the following Datadog sites: app.datadoghq.com, us3.datadoghq.com, ap1.datadoghq.com, app.ddog-gov.com

{% alert level="info" %}
Private Service Connect is only available for the US5 and EU Datadog sites.
{% /alert %}
{% /callout %}

Use the [Google Cloud Private Service Connect integration](https://docs.datadoghq.com/integrations/google-cloud-private-service-connect/) to visualize connections, data transferred, and dropped packets through Private Service Connect. This gives you visibility into important metrics from your Private Service Connect connections, both for producers as well as consumers. [Private Service Connect (PSC)](https://cloud.google.com/vpc/docs/private-service-connect) is a Google Cloud networking product that enables you to access [Google Cloud services](https://cloud.google.com/vpc/docs/private-service-connect-compatibility#google-services), [third-party partner services](https://cloud.google.com/vpc/docs/private-service-connect-compatibility#third-party-services), and company-owned applications directly from your Virtual Private Cloud (VPC).

See [Access Datadog privately and monitor your Google Cloud Private Service Connect usage](https://www.datadoghq.com/blog/google-cloud-private-service-connect/) in the Datadog blog for more information.

## Data Collected{% #data-collected %}

### Metrics{% #metrics %}

| Metric | Description |
| ------ | ----------- |
| **gcp.gce.instance.cpu.utilization**(gauge) | Fraction of the allocated CPU that is currently in use on the instance. Note that some machine types allow bursting above 100% usage. *Shown as fraction* |

#### Cumulative metrics{% #cumulative-metrics %}

Cumulative metrics are imported into Datadog with a `.delta` metric for each metric name. A cumulative metric is a metric whose value monotonically increases over time. For example, a metric for `sent bytes` might be cumulative: each value records the total number of bytes sent by a service at that time. The delta value represents the change since the previous measurement.

For example:

`gcp.gke.container.restart_count` is a CUMULATIVE metric. When importing this metric, Datadog adds the `gcp.gke.container.restart_count.delta` metric, which contains the delta values (as opposed to the running total emitted as part of the CUMULATIVE metric). See [Google Cloud metric kinds](https://cloud.google.com/monitoring/api/v3/kinds-and-types) for more information.

### Events{% #events %}

All service events generated by your Google Cloud Platform services are forwarded to the [Datadog Events Explorer](https://app.datadoghq.com/event/stream).

### Service Checks{% #service-checks %}

The Google Cloud Platform integration does not include any service checks.

### Tags{% #tags %}

Tags are automatically assigned based on a variety of Google Cloud Platform and Google Compute Engine configuration options. The `project_id` tag is added to all metrics. Additional tags are collected from Google Cloud Platform when available, and vary based on metric type.

Additionally, Datadog collects the following as tags:

- Host labels, as `<key>:<value>` tags.
- Custom labels from Google Pub/Sub, GCE, Cloud SQL, and Cloud Storage.

## Troubleshooting{% #troubleshooting %}

### Incorrect metadata for user-defined *gcp.logging* metrics?{% #incorrect-metadata-for-user-defined-gcplogging-metrics %}

For non-standard *gcp.logging* metrics, such as metrics beyond [Datadog's out of the box logging metrics](https://docs.datadoghq.com/integrations/google-stackdriver-logging/#metrics), the metadata applied may not be consistent with Google Cloud Logging.

In these cases, set the metadata manually: navigate to the [metric summary page](https://app.datadoghq.com/metric/summary), search for and select the metric in question, and click the pencil icon next to the metadata.

Need help? Contact [Datadog support](https://docs.datadoghq.com/help/).

## Further Reading{% #further-reading %}

- [Improve the compliance and security posture of your Google Cloud environment with Datadog](https://www.datadoghq.com/blog/cspm-for-gcp-with-datadog/)
- [Monitor Google Cloud Vertex AI with Datadog](https://www.datadoghq.com/blog/google-cloud-vertex-ai-monitoring-datadog/)
- [Monitor your Dataflow pipelines with Datadog](https://www.datadoghq.com/blog/monitor-dataflow-pipelines-with-datadog/)
- [Datadog Google Cloud Platform integration Terraform resource](https://registry.terraform.io/providers/DataDog/datadog/latest/docs/resources/integration_gcp_sts)
- [Monitor BigQuery with Datadog](https://www.datadoghq.com/blog/track-bigquery-costs-performance/)
- [Google Cloud Platform blog](https://www.datadoghq.com/blog/recent-changes-tab/)
