---
title: Getting Started with Google Cloud
description: >-
  Set up comprehensive monitoring for your Google Cloud environment. Configure
  service accounts, enable metric collection, and explore log forwarding and
  Agent installation.
breadcrumbs: >-
  Docs > Getting Started > Introduction to Integrations > Getting Started with
  Google Cloud
---

# Getting Started with Google Cloud

## Overview{% #overview %}

Use this guide to get started with monitoring your Google Cloud environment. This approach simplifies the setup for Google Cloud environments with multiple projects, allowing you to maximize your monitoring coverage.

## Setup{% #setup %}

### Prerequisites{% #prerequisites %}

1. Create a [Datadog account](https://www.datadoghq.com/)
1. Set up a [Service Account](https://cloud.google.com/iam/docs/service-accounts-create) in any of your Google Cloud projects
1. Review these Google Cloud Prerequisites:

{% callout %}
# Important note for users on the following Datadog sites: app.datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, app.datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com

- If your organization restricts identities by domain, you must add Datadog's customer identity `C0147pk0i` as an allowed value in your policy.
{% /callout %}

- The Google Cloud integration requires the following APIs to be enabled **for each of the projects** you want to monitor:

{% alert level="danger" %}
Ensure that any projects being monitored are not configured as [scoping projects](https://cloud.google.com/monitoring/settings#:~:text=A%20scoping%20project%20hosts%20a,is%20also%20a%20scoping%20project.) that pull in metrics from multiple other projects.
{% /alert %}

{% dl %}

{% dt %}
[Cloud Monitoring API](https://console.cloud.google.com/apis/library/monitoring.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to query your Google Cloud metric data.
{% /dd %}

{% dt %}
[Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to discover compute instance data.
{% /dd %}

{% dt %}
[Cloud Asset API](https://console.cloud.google.com/apis/library/cloudasset.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to request Google Cloud resources and link relevant labels to metrics as tags.
{% /dd %}

{% dt %}
[Cloud Resource Manager API](https://console.cloud.google.com/apis/library/cloudresourcemanager.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to append metrics with the correct resources and tags.
{% /dd %}

{% dt %}
[IAM API](https://console.cloud.google.com/apis/library/iam.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to authenticate with Google Cloud.
{% /dd %}

{% dt %}
[Cloud Billing API](https://console.cloud.google.com/apis/library/cloudbilling.googleapis.com)
{% /dt %}

{% dd %}
Allows Datadog to collect billing data for your Google Cloud projects. See the Cloud Cost Management (CCM) section for more information.
{% /dd %}

{% /dl %}

{% alert level="info" %}
You can confirm if these APIs are enabled by going to [Enabled APIs & Services](https://console.cloud.google.com/apis/dashboard).
{% /alert %}
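If you prefer the command line, the required APIs can also be enabled with the [gcloud CLI](https://cloud.google.com/sdk/docs/install). The sketch below echoes the command for review rather than running it, and the project IDs are placeholders:

```shell
# The six APIs required by the integration (see the list above).
APIS="monitoring.googleapis.com compute.googleapis.com cloudasset.googleapis.com \
cloudresourcemanager.googleapis.com iam.googleapis.com cloudbilling.googleapis.com"

# Hypothetical project IDs; replace with the projects you want to monitor.
for project in my-project-1 my-project-2; do
  cmd="gcloud services enable $APIS --project=$project"
  echo "$cmd"   # Drop the echo to actually run the command.
done
```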

### Metric collection{% #metric-collection %}

{% alert level="info" %}
If your Google Cloud organization uses [VPC Service Controls](https://cloud.google.com/vpc-service-controls/docs/overview), you must explicitly allow Datadog service accounts to access protected resources. If these service accounts are not permitted within your service perimeter, metric, resource, and metadata collection may fail. Contact [Datadog Support](https://docs.datadoghq.com/help/) for the service account identifiers for your site or region.
{% /alert %}

{% tab title="Org-level" %}
Organization-level monitoring is recommended for comprehensive coverage of all projects, including any projects created in the organization in the future.

**Note**: Your [Google Cloud Identity](https://cloud.google.com/identity/docs/overview) user account must have the `Admin` role assigned to it at the desired scope to complete the setup in Google Cloud (for example, `Organization Admin`).

{% collapsible-section %}
##### 1. Create a Google Cloud service account in the default project

1. Open your [Google Cloud console](https://console.cloud.google.com/).
1. Navigate to **IAM & Admin** > **Service Accounts**.
1. Click **Create service account** at the top.
1. Give the service account a unique name.
1. Click **Done** to complete creating the service account.
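The same service account can be created from the gcloud CLI. This is a sketch only: the name and project below are placeholders, and the command is echoed rather than executed.

```shell
# Hypothetical values; replace with your own service account name and project.
SA_NAME="datadog-integration"
PROJECT_ID="my-default-project"

cmd="gcloud iam service-accounts create $SA_NAME \
  --display-name='Datadog integration' --project=$PROJECT_ID"
echo "$cmd"   # Drop the echo to actually run the command.
```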

{% /collapsible-section %}

{% collapsible-section %}
##### 2. Add the service account at the organization or folder level

1. In the Google Cloud console, go to the **IAM** page.
1. Select a folder or organization.
1. To grant a role to a principal that does not already have other roles on the resource, click **Grant Access**.
1. Enter the email address of the service account you created earlier.
1. Assign the following roles:
   - [Compute Viewer](https://cloud.google.com/compute/docs/access/iam#compute.viewer) provides **read-only** access to get and list Compute Engine resources
   - [Monitoring Viewer](https://cloud.google.com/monitoring/access-control#monitoring_roles) provides **read-only** access to the monitoring data available in your Google Cloud environment
   - [Cloud Asset Viewer](https://cloud.google.com/iam/docs/understanding-roles#cloudasset.viewer) provides **read-only** access to cloud assets metadata
   - [Browser](https://cloud.google.com/resource-manager/docs/access-control-proj#browser) provides **read-only** access to browse the hierarchy of a project
   - [Service Usage Consumer](https://cloud.google.com/service-usage/docs/access-control#serviceusage.serviceUsageConsumer) (**optional**, for multi-project environments) provides per-project cost and API quota attribution
1. Click **Save**.

**Note**: The `Browser` role is only required in the default project of the service account. Other projects require only the other listed roles.
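The console steps above can also be sketched with the gcloud CLI. The organization ID and service account email below are placeholders, and each binding is echoed rather than applied; for a folder, the equivalent command is `gcloud resource-manager folders add-iam-policy-binding`.

```shell
# Placeholders: substitute your organization ID and service account email.
ORG_ID="123456789012"
SA_EMAIL="datadog-integration@my-default-project.iam.gserviceaccount.com"

# The four core roles listed above (Service Usage Consumer is optional).
for role in roles/compute.viewer roles/monitoring.viewer \
            roles/cloudasset.viewer roles/browser; do
  cmd="gcloud organizations add-iam-policy-binding $ORG_ID \
    --member=serviceAccount:$SA_EMAIL --role=$role"
  echo "$cmd"   # Drop the echo to actually apply the binding.
done
```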
{% /collapsible-section %}

{% collapsible-section %}
##### 3. Add the Datadog principal to your service account

**Note**: If you previously configured access using a shared Datadog principal, you can revoke the permission for that principal after you complete these steps.

1. In Datadog, navigate to **Integrations** > [**Google Cloud Platform**](https://app.datadoghq.com/integrations/google-cloud-platform).
1. Click **Add Google Cloud Account**. If you have no configured projects, you are automatically redirected to this page.
1. Copy your Datadog principal and keep it for the next section.

{% image
   source="https://datadog-docs.imgix.net/images/integrations/google_cloud_platform/principal-2.de65e967af6ef672299441eb286356ce.png?auto=format"
   alt="The page for adding a new Google Cloud account in Datadog's Google Cloud integration tile" /%}

**Note**: Keep this window open for Section 4.

1. In the [Google Cloud console](https://console.cloud.google.com/), under the **Service Accounts** menu, find the service account you created in Section 1.
1. Go to the **Permissions** tab and click **Grant Access**.
   {% image
      source="https://datadog-docs.imgix.net/images/integrations/google_cloud_platform/grant-access.4ac9c4f78e350411355e5fdebe07dd27.png?auto=format"
      alt="Google Cloud console interface, showing the Permissions tab under Service Accounts." /%}
1. Paste your Datadog principal into the **New principals** text box.
1. Assign the role of **Service Account Token Creator**.
1. Click **Save**.
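Granting the **Service Account Token Creator** role can also be done from the gcloud CLI. This is a sketch: both email values below are placeholders (use the Datadog principal you copied from the integration tile), and the command is echoed rather than executed.

```shell
# Placeholders: the service account from Section 1, and the Datadog principal
# copied from the integration tile (the value below is illustrative only).
SA_EMAIL="datadog-integration@my-default-project.iam.gserviceaccount.com"
DD_PRINCIPAL="your-datadog-principal@example.iam.gserviceaccount.com"

cmd="gcloud iam service-accounts add-iam-policy-binding $SA_EMAIL \
  --member=serviceAccount:$DD_PRINCIPAL \
  --role=roles/iam.serviceAccountTokenCreator"
echo "$cmd"   # Drop the echo to actually grant the role.
```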
{% /collapsible-section %}

{% collapsible-section %}
##### 4. Complete the integration setup in Datadog

1. In your Google Cloud console, navigate to the **Service Account** > **Details** tab. On this page, find the email associated with this Google service account. It has the format `<SA_NAME>@<PROJECT_ID>.iam.gserviceaccount.com`.
1. Copy this email.
1. Return to the integration configuration tile in Datadog (where you copied your Datadog principal in the previous section).
1. Paste the email you copied in **Add Service Account Email**.
1. Click **Verify and Save Account**.

{% /collapsible-section %}

Metrics appear in Datadog approximately **15 minutes** after setup.
{% /tab %}

{% tab title="Project- and Folder-level" %}

{% collapsible-section #quickstart-setup %}
#### Quick Start (recommended)

### Prerequisites{% #prerequisites %}

To use the Quick Start method, your Datadog user role must be able to create API and application keys. If you're using a [Datadog-managed role](https://docs.datadoghq.com/account_management/rbac/permissions/#managed-roles), you must have the **Datadog Admin role**. If you're using a [custom role](https://docs.datadoghq.com/account_management/rbac/permissions/#custom-roles), your role needs to have at least the `api_keys_write` and `user_app_keys` permissions.

### Choose Quick Start setup if…{% #choose-quick-start-setup-if %}

- You are setting up the Google Cloud integration for the first time.
- You prefer a UI-based workflow, and want to minimize the time it takes to create a service account with the required monitoring permissions.
- You want to automate setup steps in scripts or CI/CD pipelines.

### Instructions{% #instructions %}

1. In the [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), select **+ Add GCP Account**.
1. Click **Quick Start**.
1. Click **Copy** in the setup script section. **Note**: Datadog recommends running this script locally through the [gcloud CLI](https://cloud.google.com/sdk/docs/install), as it may be faster. This requires your Google Cloud credentials to be available locally, and the gcloud CLI installed on your machine.
1. Click **Open Google Cloud Shell**, or go to [Google Cloud Shell](https://ssh.cloud.google.com/cloudshell).
1. Paste the script into the shell prompt and run it.
1. Select any folders and projects to be monitored. You can only see projects and folders that you have the required access and permissions for.
1. Under **Provide Service Account Details**:
   1. Give the service account a name.
   1. Select the project to contain the service account.
1. Configure **Metric Collection** (optional).
   1. Choose whether to disable the option for silencing monitors for expected GCE instance shutdowns and autoscaling events.
   1. Choose whether to apply tags to the metrics associated with the created service account.
   1. Choose whether to disable metric collection for specific Google Cloud services to help control Google Cloud Monitoring costs.
   1. Choose whether to apply granular metric filters for any Google Cloud services enabled for metric collection.
   1. Choose whether to filter metrics by tags for GCP resource types `Cloud Run Revision`, `VM Instance`, or `Cloud Function` to help control Datadog costs. **Note**: `VM Instance` filtering does not impact related `gcp.logging.*` metrics and does not cause any billing impact for those metrics.
1. Configure **Resource Collection** (attributes and configuration information of the resources in your Google Cloud environment, optional).
1. A summary of the changes to be made is displayed. If confirmed, the script:
   - Enables the required APIs
   - Assigns the necessary permissions to monitor each selected project and folder
   - Completes the integration setup in Datadog

{% /collapsible-section %}

{% collapsible-section #terraform-setup %}
#### Terraform

### Choose Terraform setup if…{% #choose-terraform-setup-if %}

- You manage infrastructure as code and want to keep the Datadog Google Cloud integration under version control.
- You need to configure multiple folders or projects consistently with reusable provider blocks.
- You want a repeatable, auditable deployment process that fits into your Terraform-managed environment.

### Instructions{% #instructions %}

1. In the [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), select **+ Add GCP Account**.
1. Select **Terraform**.
1. Under **Provide GCP Resources**, add any project IDs and folder IDs to be monitored.
1. Select any folders and projects to be monitored.
1. Under **Provide Service Account Details**:
   1. Give the service account a name.
   1. Select the project to contain the service account.
1. Configure **Metric Collection** (optional).
   1. Choose whether to disable the option for silencing monitors for expected GCE instance shutdowns and autoscaling events.
   1. Choose whether to apply tags to the metrics associated with the created service account.
   1. Choose whether to disable metric collection for specific Google Cloud services to help control Google Cloud Monitoring costs.
   1. Choose whether to apply granular metric filters for any Google Cloud services enabled for metric collection.
   1. Choose whether to filter metrics by tags for GCP resource types `Cloud Run Revision`, `VM Instance`, or `Cloud Function` to help control Datadog costs.
1. Configure **Resource Collection** (attributes and configuration information of the resources in your Google Cloud environment).
1. Copy the provided **Terraform Code**.
1. Paste the code into a `.tf` file, and run the **Initialize and apply the Terraform** command. If successful, the command:
   - Enables the required APIs
   - Assigns the necessary permissions to monitor each selected project and folder
   - Completes the integration setup in Datadog

{% /collapsible-section %}

{% collapsible-section #manual-setup %}
#### Manual

### Choose manual setup if…{% #choose-manual-setup-if %}

- You need to set up access manually for a smaller number of projects or folders.
- You want more step-by-step control over assigning permissions and credentials within the GCP UI.

### Instructions{% #instructions %}

1. In the [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), select **+ Add GCP Account**.
1. Click **Manual**.
1. Copy the **Datadog Principal** value, and click **Open the Google Console**.
1. Create a service account:
   1. Give the service account a descriptive name, and click **Create and continue**.
   1. Under **Permissions**, search for and add the **Service Account Token Creator** role from the dropdown, and click **Continue**.
   1. Under **Principals with access**, paste the **Datadog Principal** value into the **Service account users role** field, and click **Done**.
1. Click the service account link under the **Email** column.
1. Copy the **Email** value.
1. In Datadog, paste the service account email in the **Add Service Account Email** section.
1. Configure **Metric Collection** (optional).
   1. Choose whether to disable the option for silencing monitors for expected GCE instance shutdowns and autoscaling events.
   1. Choose whether to apply tags to the metrics associated with the created service account.
   1. Choose whether to disable metric collection for specific Google Cloud services to help control Google Cloud Monitoring costs.
   1. Choose whether to apply granular metric filters for any Google Cloud services enabled for metric collection.
   1. Choose whether to filter metrics by tags for GCP resource types `Cloud Run Revision`, `VM Instance`, or `Cloud Function` to help control Datadog costs.
1. Configure **Resource Collection** (attributes and configuration information of the resources in your Google Cloud environment, optional).
1. Click **Verify and Save Account**.

{% /collapsible-section %}

{% /tab %}

#### Validation{% #validation %}

To view your metrics, use the left menu to navigate to **Metrics** > **Summary** and search for `gcp`:

{% image
   source="https://datadog-docs.imgix.net/images/integrations/google_cloud_platform/gcp_metric_summary.af08dc9779730588601ddec68f47eaf5.png?auto=format"
   alt="The Metric Summary page in Datadog filtered to metrics beginning with GCP" /%}

### Google Cloud integrations{% #google-cloud-integrations %}

The Google Cloud integration collects all available [Google Cloud metrics](https://cloud.google.com/monitoring/api/metrics_gcp) from your projects through the Google Cloud Monitoring API. Integrations are installed automatically when Datadog recognizes data being ingested from your Google Cloud account, such as BigQuery.

{% collapsible-section %}
##### See the Google Cloud integrations Datadog collects metrics from

| Integration                                                                                                    | Description                                                                           |
| -------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------- |
| [App Engine](https://docs.datadoghq.com/integrations/google_app_engine/)                                       | PaaS (platform as a service) to build scalable applications                           |
| [BigQuery](https://docs.datadoghq.com/integrations/google_cloud_bigquery/)                                     | Enterprise data warehouse                                                             |
| [Bigtable](https://docs.datadoghq.com/integrations/google_cloud_bigtable/)                                     | NoSQL Big Data database service                                                       |
| [Cloud SQL](https://docs.datadoghq.com/integrations/google_cloudsql/)                                          | Managed relational database service for MySQL, PostgreSQL, and SQL Server             |
| [Cloud APIs](https://docs.datadoghq.com/integrations/google_cloud_apis/)                                       | Programmatic interfaces for all Google Cloud Platform services                        |
| [Cloud Armor](https://docs.datadoghq.com/integrations/google_cloud_armor/)                                     | Network security service to help protect against denial of service and web attacks    |
| [Cloud Composer](https://docs.datadoghq.com/integrations/google_cloud_composer/)                               | A fully managed workflow orchestration service                                        |
| [Cloud Dataproc](https://docs.datadoghq.com/integrations/google_cloud_dataproc/)                               | A cloud service for running Apache Spark and Apache Hadoop clusters                   |
| [Cloud Dataflow](https://docs.datadoghq.com/integrations/google_cloud_dataflow/)                               | A fully-managed service for transforming and enriching data in stream and batch modes |
| [Cloud Filestore](https://docs.datadoghq.com/integrations/google_cloud_filestore/)                             | High-performance, fully managed file storage                                          |
| [Cloud Firestore](https://docs.datadoghq.com/integrations/google_cloud_firestore/)                             | A flexible, scalable database for mobile, web, and server development                 |
| [Cloud Interconnect](https://docs.datadoghq.com/integrations/google_cloud_interconnect/)                       | Hybrid connectivity                                                                   |
| [Cloud IoT](https://docs.datadoghq.com/integrations/google_cloud_iot/)                                         | Secure device connection and management                                               |
| [Cloud Load Balancing](https://docs.datadoghq.com/integrations/google_cloud_loadbalancing/)                    | Distribute load-balanced compute resources                                            |
| [Cloud Logging](https://docs.datadoghq.com/integrations/google_stackdriver_logging/)                           | Real-time log management and analysis                                                 |
| [Cloud Memorystore for Redis](https://docs.datadoghq.com/integrations/google_cloud_redis/)                     | A fully managed in-memory data store service                                          |
| [Cloud Router](https://docs.datadoghq.com/integrations/google_cloud_router/)                                   | Exchange routes between your VPC and on-premises networks by using BGP                |
| [Cloud Run](https://docs.datadoghq.com/integrations/google_cloud_run/)                                         | Managed compute platform that runs stateless containers through HTTP                  |
| [Cloud Security Command Center](https://docs.datadoghq.com/integrations/google_cloud_security_command_center/) | Threat reporting service                                                              |
| [Cloud Tasks](https://docs.datadoghq.com/integrations/google_cloud_tasks/)                                     | Distributed task queues                                                               |
| [Cloud TPU](https://docs.datadoghq.com/integrations/google_cloud_tpu/)                                         | Train and run machine learning models                                                 |
| [Compute Engine](https://docs.datadoghq.com/integrations/google_compute_engine/)                               | High performance virtual machines                                                     |
| [Container Engine](https://docs.datadoghq.com/integrations/google_container_engine/)                           | Kubernetes, managed by Google                                                         |
| [Datastore](https://docs.datadoghq.com/integrations/google_cloud_datastore/)                                   | NoSQL database                                                                        |
| [Firebase](https://docs.datadoghq.com/integrations/google_cloud_firebase/)                                     | Mobile platform for application development                                           |
| [Functions](https://docs.datadoghq.com/integrations/google_cloud_functions/)                                   | Serverless platform for building event-based microservices                            |
| [Kubernetes Engine](https://docs.datadoghq.com/integrations/google_kubernetes_engine/)                         | Cluster manager and orchestration system                                              |
| [Machine Learning](https://docs.datadoghq.com/integrations/google_cloud_ml/)                                   | Machine learning services                                                             |
| [Private Service Connect](https://docs.datadoghq.com/integrations/google_cloud_private_service_connect/)       | Access managed services with private VPC connections                                  |
| [Pub/Sub](https://docs.datadoghq.com/integrations/google_cloud_pubsub/)                                        | Real-time messaging service                                                           |
| [Spanner](https://docs.datadoghq.com/integrations/google_cloud_spanner/)                                       | Horizontally scalable, globally consistent, relational database service               |
| [Storage](https://docs.datadoghq.com/integrations/google_cloud_storage/)                                       | Unified object storage                                                                |
| [Vertex AI](https://docs.datadoghq.com/integrations/google_cloud_vertex_ai/)                                   | Build, train and deploy custom machine learning (ML) models.                          |
| [VPN](https://docs.datadoghq.com/integrations/google_cloud_vpn/)                                               | Managed network functionality                                                         |

{% /collapsible-section %}

For deep dives into monitoring many of the more popular services, check out the blogs linked below.

{% collapsible-section %}
##### Integration blogs

{% dl %}

{% dt %}
[Cloud Armor](https://www.datadoghq.com/blog/network-attacks-google-cloud-armor/)
{% /dt %}

{% dd %}
Google Cloud Armor is a network security service protecting against DDoS and application attacks.
{% /dd %}

{% dt %}
[BigQuery](https://www.datadoghq.com/blog/track-bigquery-costs-performance/)
{% /dt %}

{% dd %}
BigQuery is a serverless and multi-cloud data warehouse that can provide you with valuable insights from your business data.
{% /dd %}

{% dt %}
[Cloud Run](https://www.datadoghq.com/blog/collect-traces-logs-from-cloud-run-with-datadog/)
{% /dt %}

{% dd %}
Cloud Run is a fully-managed platform that lets you run your code directly on scalable infrastructure in Google Cloud.
{% /dd %}

{% dt %}
[Cloud SQL](https://www.datadoghq.com/blog/monitor-google-cloud-sql/)
{% /dt %}

{% dd %}
Cloud SQL is a fully-managed relational database service that works with MySQL, PostgreSQL, and SQL Server.
{% /dd %}

{% dt %}
[Compute Engine](https://www.datadoghq.com/blog/monitor-google-compute-engine-with-datadog/)
{% /dt %}

{% dd %}
Compute Engine is a computing and hosting service that provides you with the ability to create and run virtual machines in Google Cloud.
{% /dd %}

{% dt %}
[Dataflow](https://www.datadoghq.com/blog/monitor-dataflow-pipelines-with-datadog/)
{% /dt %}

{% dd %}
Dataflow is a fully-managed streaming analytics service that uses autoscaling and real-time data processing.
{% /dd %}

{% dt %}
[Eventarc](https://www.datadoghq.com/blog/incident-response-eventarc-datadog/)
{% /dt %}

{% dd %}
Eventarc is a fully-managed service enabling you to build event-driven architectures.
{% /dd %}

{% dt %}
[Google Kubernetes Engine (GKE)](https://www.datadoghq.com/blog/monitor-google-kubernetes-engine/)
{% /dt %}

{% dd %}
GKE is a fully-managed Kubernetes service.
{% /dd %}

{% dt %}
[Private Service Connect](https://www.datadoghq.com/blog/google-cloud-private-service-connect/)
{% /dt %}

{% dd %}
Private Service Connect lets you access managed Google services privately from within your VPC network.
{% /dd %}

{% dt %}
[Security Command Center](https://www.datadoghq.com/blog/datadog-google-security-command-center/)
{% /dt %}

{% dd %}
Security Command Center provides posture management and threat detection for code, identities, and data.
{% /dd %}

{% dt %}
[Vertex AI](https://www.datadoghq.com/blog/google-cloud-vertex-ai-monitoring-datadog/)
{% /dt %}

{% dd %}
Vertex AI is a fully-managed generative AI development platform.
{% /dd %}

{% /dl %}

{% /collapsible-section %}

### Limit metric collection filters{% #limit-metric-collection-filters %}

You can choose which services and resources to collect metrics from. This can help control costs by reducing the number of API calls made on your behalf.

{% collapsible-section %}
#### Limit metric collection by Google Cloud service, and by granular metric filters

Under the **Metric Collection** tab in Datadog's [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), deselect the metric namespaces to exclude.

To apply granular metric filtering for enabled services, click on the service in question and apply your filters in the `Add filters for gcp.<service>` field.

{% image
   source="https://datadog-docs.imgix.net/images/integrations/google_cloud_platform/limit_metric_collection_2025-11-11.383906416aa505b6f53d8e9620b1f7ba.png?auto=format"
   alt="The metric collection tab in the Datadog Google Cloud integration page, with the AI Platform service expanded to display the Add filters for gcp.ml field" /%}

**Example filters**:

{% dl %}

{% dt %}
`subscription.*` `topic.*`
{% /dt %}

{% dd %}
Limit collection to metrics **matching either** `gcp.<service>.subscription.*` **or** `gcp.<service>.topic.*`
{% /dd %}

{% dt %}
`!*_cost` `!*_count`
{% /dt %}

{% dd %}
Limit collection to metrics **matching neither** `gcp.<service>.*_cost` **nor** `gcp.<service>.*_count`
{% /dd %}

{% dt %}
`snapshot.*` `!*_by_region`
{% /dt %}

{% dd %}
Limit collection to metrics **matching** `gcp.<service>.snapshot.*` **but not matching** `gcp.<service>.*_by_region`
{% /dd %}

{% /dl %}
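The filter syntax above is glob-like, so its effect can be sketched with shell `case` patterns, which match the same way. This mimics the first example (the metric names below are illustrative):

```shell
# Keep only metrics matching subscription.* or topic.*, as in the
# first example filter above; everything else is excluded.
keep() {
  case "$1" in
    subscription.*|topic.*) echo include ;;
    *) echo exclude ;;
  esac
}

keep "subscription.num_undelivered_messages"   # prints "include"
keep "snapshot.backlog_bytes"                  # prints "exclude"
```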

{% /collapsible-section %}

{% collapsible-section %}
#### Limit metric collection by Google Cloud region, and by global resources

Under the **Metric Collection** tab in Datadog's [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), deselect the regions you want to exclude from metric collection.

You can also specify additional locations not listed and disable any global metrics not associated with a region.

{% image
   source="https://datadog-docs.imgix.net/images/integrations/google_cloud_platform/metric_region_filtering.865a0a4dcf05598afdb4888465e6a0b8.png?auto=format"
   alt="The metric collection tab in the Datadog Google Cloud integration page, with the Enable Global Metrics option highlighted and a subset of regions selected. The Additional Locations option is also highlighted with a multi-region filter defined" /%}

{% /collapsible-section %}

{% collapsible-section %}
#### Limit metric collection by host or Cloud Run instance

1. Assign a tag (such as `datadog:true`) to the hosts or Cloud Run instances you want to monitor with Datadog.
1. Under the **Metric Collection** tab in Datadog's [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform), enter the tags in the **Limit Metric Collection Filters** textbox. Only hosts that match one of the defined tags are imported into Datadog. You can use wildcards (`?` for single character, `*` for multi-character) to match many hosts, or `!` to exclude certain hosts. This example includes all `c1*` sized instances, but excludes staging hosts:

```text
datadog:monitored,env:production,!env:staging,instance-type:c1.*
```

See Google's documentation on [Creating and managing labels](https://cloud.google.com/resource-manager/docs/creating-managing-labels) for more details.
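On Compute Engine, the `datadog:true` tag in step 1 corresponds to an instance label, which can be applied with the gcloud CLI. The instance name and zone below are placeholders, and the command is echoed rather than executed.

```shell
# Placeholders: instance name and zone. "datadog=true" is the gcloud
# syntax for the datadog:true label described above.
cmd="gcloud compute instances add-labels my-instance \
  --zone=us-central1-a --labels=datadog=true"
echo "$cmd"   # Drop the echo to actually apply the label.
```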
{% /collapsible-section %}

In the example below, only Google Cloud hosts with the label `datadog:true` are monitored by Datadog:

{% image
   source="https://datadog-docs.imgix.net/images/integrations/google_cloud_platform/limit_metric_collection.017236d2933422be21e440bd296bae60.png?auto=format"
   alt="The fields to limit metric collection in the Google Cloud integration tile" /%}

#### Best practices for monitoring multiple projects{% #best-practices-for-monitoring-multiple-projects %}

##### Enable per-project cost and API quota attribution{% #enable-per-project-cost-and-api-quota-attribution %}

By default, Google Cloud attributes the cost of monitoring API calls, as well as API quota usage, to the project containing the service account for this integration. As a best practice for Google Cloud environments with multiple projects, enable per-project cost attribution of monitoring API calls and API quota usage. With this enabled, costs and quota usage are attributed to the project being *queried*, rather than the project containing the service account. This provides visibility into the monitoring costs incurred by each project, and also helps to prevent reaching API rate limits.

To enable this feature:

1. Ensure that the Datadog service account has the [Service Usage Consumer](https://cloud.google.com/service-usage/docs/access-control#serviceusage.serviceUsageConsumer) role at the desired scope (folder or organization).
1. Click the **Enable Per Project Quota** toggle in the **Projects** tab of the [Google Cloud integration page](https://app.datadoghq.com/integrations/google-cloud-platform/).
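Step 1 can also be sketched with the gcloud CLI. The organization ID and service account email below are placeholders, and the binding is echoed rather than applied; use `gcloud resource-manager folders add-iam-policy-binding` for a folder scope.

```shell
# Placeholders: organization ID and the integration's service account email.
ORG_ID="123456789012"
SA_EMAIL="datadog-integration@my-default-project.iam.gserviceaccount.com"

cmd="gcloud organizations add-iam-policy-binding $ORG_ID \
  --member=serviceAccount:$SA_EMAIL \
  --role=roles/serviceusage.serviceUsageConsumer"
echo "$cmd"   # Drop the echo to actually apply the binding.
```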

## Log collection{% #log-collection %}

Forwarding logs from your Google Cloud environment enables near real-time monitoring of the resources and activities taking place in your organization or folder. You can set up [log monitors](https://docs.datadoghq.com/monitors/types/log/) to be notified of issues, use [Cloud SIEM](https://docs.datadoghq.com/security/cloud_siem/) to detect threats, or leverage [Watchdog](https://docs.datadoghq.com/watchdog/) to identify unknown issues or anomalous behavior.

Use the [Datadog Dataflow template](https://cloud.google.com/dataflow/docs/guides/templates/provided/pubsub-to-datadog) to batch and compress your log events before forwarding them to Datadog through [Google Cloud Dataflow](https://cloud.google.com/dataflow). This is the most network-efficient way to forward your logs. To specify which logs are forwarded, configure the [Google Cloud Logging sink](https://cloud.google.com/logging/docs/routing/overview#sinks) with any inclusion or exclusion queries using Google Cloud's [Logging query language](https://cloud.google.com/logging/docs/view/logging-query-language). See the [Google Cloud Log Forwarding Setup page](https://docs.datadoghq.com/logs/guide/google-cloud-log-forwarding) for log forwarding setup options (including Terraform) and instructions.

{% alert level="danger" %}
The Dataflow API must be enabled to use Google Cloud Dataflow. See [Enabling APIs](https://cloud.google.com/apis/docs/getting-started#enabling_apis) in the Google Cloud documentation for more information.
{% /alert %}

## Leveraging the Datadog Agent{% #leveraging-the-datadog-agent %}

After the Google Cloud integration is configured, Datadog automatically starts collecting Google Cloud metrics. However, you can use the Datadog Agent to gather deeper insights into your infrastructure.

The [Datadog Agent](https://docs.datadoghq.com/agent/) provides the [most granular, low-latency metrics](https://docs.datadoghq.com/extend/guide/data-collection-resolution-retention/#pagetitle:~:text=n/a-,Infrastructure,-Agent%20integrations) from your infrastructure, delivering real-time insights into CPU, memory, disk usage, and more for your Google Cloud hosts. The Agent can be installed on any host, including [GKE](https://docs.datadoghq.com/integrations/gke/?tab=standard).
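For example, on a Linux Compute Engine host, one common installation path is Datadog's one-line install script. The command below is a sketch: replace the API key placeholder with your own, set `DD_SITE` to your Datadog site, and use the in-app installation instructions for the exact command for your platform.

```shell
# Install the Datadog Agent (v7) on a Linux host.
# <YOUR_API_KEY> and the DD_SITE value must match your account.
DD_API_KEY=<YOUR_API_KEY> DD_SITE="datadoghq.com" bash -c \
  "$(curl -L https://install.datadoghq.com/scripts/install_script_agent7.sh)"
```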

The Agent also supports a wide range of [integrations](https://docs.datadoghq.com/integrations/), enabling you to extend visibility into specific services and databases running on your hosts.

[Traces](https://docs.datadoghq.com/tracing/) collected through the Agent enable comprehensive Application Performance Monitoring (APM), helping you understand end-to-end service performance.

[Logs](https://docs.datadoghq.com/logs/) collected through the Agent provide visibility into your Google Cloud resources and the activities taking place in your Google Cloud environment.

For the full list of benefits of installing the Agent on your cloud instances, see [Why should I install the Datadog Agent on my cloud instances?](https://docs.datadoghq.com/agent/guide/why-should-i-install-the-agent-on-my-cloud-instances/)

## Resource changes collection{% #resource-changes-collection %}

Resource changes collection allows you to monitor infrastructure changes in your Google Cloud environment. When Google's Cloud Asset Inventory detects changes in your cloud resources, an event is forwarded to Datadog's [Event Management](https://app.datadoghq.com/event/overview) through a Cloud Pub/Sub topic and subscription. Use these events to be proactively notified of risky changes in your infrastructure, and to assist with troubleshooting.
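The Pub/Sub topic and subscription that carry these change events can be sketched with the `gcloud` CLI as below. The resource names are illustrative placeholders; the linked setup instructions specify the values the integration expects.

```shell
# Create the Pub/Sub topic that Cloud Asset Inventory publishes
# change events to, and a subscription for Datadog to consume.
gcloud pubsub topics create datadog-cloud-asset-changes
gcloud pubsub subscriptions create datadog-cloud-asset-changes-sub \
  --topic=datadog-cloud-asset-changes
```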

For detailed setup instructions, see the [resource changes collection section](https://docs.datadoghq.com/integrations/google_cloud_platform/#resource-changes-collection) of the Google Cloud integration documentation.

## Explore related services{% #explore-related-services %}

### Private Service Connect{% #private-service-connect %}

{% alert level="info" %}
Private Service Connect is only available for the US5 and EU Datadog sites.
{% /alert %}

Use the [Google Cloud Private Service Connect integration](https://docs.datadoghq.com/integrations/google_cloud_private_service_connect/) to visualize connections, data transferred, and dropped packets through Private Service Connect. This gives you visibility into important metrics from your Private Service Connect connections, for both producers and consumers. [Private Service Connect (PSC)](https://cloud.google.com/vpc/docs/private-service-connect) is a Google Cloud networking product that enables you to access [Google Cloud services](https://cloud.google.com/vpc/docs/private-service-connect-compatibility#google-services), [third-party partner services](https://cloud.google.com/vpc/docs/private-service-connect-compatibility#third-party-services), and company-owned applications directly from your Virtual Private Cloud (VPC).

See [Access Datadog privately and monitor your Google Cloud Private Service Connect usage](https://www.datadoghq.com/blog/google-cloud-private-service-connect/) in the Datadog blog for more information.

### Google Cloud Run{% #google-cloud-run %}

Use the [Google Cloud Run integration](https://docs.datadoghq.com/integrations/google_cloud_run/) to get detailed information on your Cloud Run containers, such as metrics and audit logs.

### Cloud Cost Management (CCM){% #cloud-cost-management-ccm %}

Datadog's [Google Cloud Cost Management](https://docs.datadoghq.com/cloud_cost_management/setup/google_cloud/) provides insights for engineering and finance teams to understand how infrastructure changes impact costs, allocate spend across your organization, and identify potential improvements.

### Cloud SIEM{% #cloud-siem %}

Cloud SIEM provides real-time analysis of operational and security logs, while using out-of-the-box integrations and rules to detect and investigate threats. To use this feature, see [Getting Started with Cloud SIEM](https://docs.datadoghq.com/getting_started/cloud_siem/).

To view security findings from [Google Cloud Security Command Center](https://console.cloud.google.com/projectselector2/security/command-center/overview?supportedpurview=organizationId,folder,project) in Cloud SIEM, toggle the **Enable collection of security findings** option under the **Security Findings** tab and follow the setup instructions on the [Google Cloud Security Command Center guide](https://docs.datadoghq.com/integrations/google_cloud_security_command_center/#installation).

{% image
   source="https://datadog-docs.imgix.net/images/integrations/google_cloud_platform/security_findings.1d8f1788440f5b1b6e9cf76a8ff5ce5d.png?auto=format"
   alt="The security findings tab in the Google Cloud integration tile" /%}

### Cloud Security{% #cloud-security %}

Datadog Cloud Security delivers real-time threat detection and continuous configuration audits across your entire cloud infrastructure. See the [Setting up Cloud Security guide](https://docs.datadoghq.com/security/cloud_security_management/setup/) to get started.

After setting up Cloud Security, toggle the **Enable Resource Collection** option under the **Resource Collection** tab to start collecting configuration data for the [Resource Catalog](https://docs.datadoghq.com/infrastructure/resource_catalog/) and Cloud Security. Then, follow these instructions to enable [Misconfigurations and Identity Risks (CIEM)](https://docs.datadoghq.com/security/cloud_security_management/setup/cloud_integrations/?tab=googlecloud) on Google Cloud.

{% image
   source="https://datadog-docs.imgix.net/images/integrations/google_cloud_platform/resource_collection.6071f0666833140be49a4082bb511df3.png?auto=format"
   alt="The resource collection tab in the Google Cloud integration tile" /%}

### Expanded BigQuery monitoring{% #expanded-bigquery-monitoring %}

You can get granular visibility into your BigQuery environments to monitor the performance of your BigQuery jobs and the quality of your BigQuery data. See the [Expanded BigQuery monitoring section](https://docs.datadoghq.com/integrations/google_cloud_platform/#expanded-bigquery-monitoring) in the main Google Cloud integration page for more information and setup instructions.

## Further reading{% #further-reading %}

- [Google Cloud integration](https://docs.datadoghq.com/integrations/google_cloud_platform/?tab=dataflowmethodrecommended)
- [Google Cloud integration billing](https://docs.datadoghq.com/account_management/billing/google_cloud/)
- [Cloud Metric Delay](https://docs.datadoghq.com/integrations/guide/cloud-metric-delay/)
- [Why should I install the Datadog Agent on my cloud instances?](https://docs.datadoghq.com/agent/guide/why-should-i-install-the-agent-on-my-cloud-instances/)
- [New GKE dashboards and metrics provide deeper visibility into your environment](https://www.datadoghq.com/blog/gke-dashboards-integration-improvements/)
- [Access Datadog privately and monitor your Google Cloud Private Service Connect usage](https://www.datadoghq.com/blog/google-cloud-private-service-connect/)
- [Monitor BigQuery with Datadog](https://www.datadoghq.com/blog/track-bigquery-costs-performance/)
- [Empower engineers to take ownership of Google Cloud costs with Datadog](https://www.datadoghq.com/blog/google-cloud-cost-management/)
- [Collect traces, logs, and custom metrics from your Google Cloud Run services with Datadog](https://www.datadoghq.com/blog/collect-traces-logs-from-cloud-run-with-datadog/)
