---
title: Google Cloud
description: Datadog, the leading service for cloud-scale monitoring.
breadcrumbs: Docs > Cloud Cost Management > Setup > Google Cloud
---

# Google Cloud

## Overview{% #overview %}

To use Google Cloud Cost Management in Datadog, follow these steps:

1. Configure the [Google Cloud Platform Integration](https://docs.datadoghq.com/integrations/google_cloud_platform/)
1. Set up the [detailed usage cost export](https://docs.datadoghq.com/cloud_cost_management/setup/google_cloud/#enable-detailed-usage-cost-export) with the necessary permissions (Google Service APIs, export project access, and BigQuery Dataset access)
1. Create or select a [Google Cloud Storage bucket](https://docs.datadoghq.com/cloud_cost_management/setup/google_cloud/#create-or-select-a-google-cloud-storage-bucket) with the necessary permissions (Bucket access)

## Setup{% #setup %}

You can set up Cloud Costs using the [API](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-google-cloud-usage-cost-config), [Terraform](https://registry.terraform.io/providers/DataDog/datadog/latest/docs/resources/gcp_uc_config), or directly in Datadog by following the instructions below.

### Configure the Google Cloud Platform integration{% #configure-the-google-cloud-platform-integration %}

Navigate to [Setup & Configuration](https://app.datadoghq.com/cost/setup), add a Google Cloud Platform account, and follow the steps to configure the Google Cloud Platform integration.

{% alert level="danger" %}
The Datadog Google Cloud Platform integration allows Cloud Costs to automatically monitor all projects this service account has access to. To limit infrastructure monitoring hosts for these projects, apply tags to the hosts. Then define whether the tags should be included or excluded from monitoring in the **Limit Metric Collection Filters** section of the integration page.
{% /alert %}

{% image
   source="https://datadog-docs.imgix.net/images/cloud_cost/gcp_integration_limit_metric_collection.7804f53c784c81ba3f0f53da98b09e73.png?auto=format"
   alt="Limit metric collection filters section configured in the Google Cloud Platform integration page" /%}

### Enable detailed usage cost export{% #enable-detailed-usage-cost-export %}

{% alert level="info" %}
The [detailed usage cost data](https://cloud.google.com/billing/docs/how-to/export-data-bigquery-tables/detailed-usage) provides all the information included in the standard usage cost data, along with additional fields that provide granular, resource-level cost data.
{% /alert %}

1. Navigate to [Billing Export](https://console.cloud.google.com/billing/export/) in the Google Cloud console under **Billing**.
1. Enable the [Detailed Usage cost](https://cloud.google.com/billing/docs/how-to/export-data-bigquery-setup) export (select or create a project and a BigQuery dataset).
1. Record the `Billing Account ID` of the billing account where the export was configured, along with the export `Project ID` and `Dataset Name`.

{% image
   source="https://datadog-docs.imgix.net/images/cloud_cost/billing_export.23529e8c0431ae42719df68cdd25baed.png?auto=format"
   alt="Google Cloud project and dataset info highlighted" /%}

*Newly created BigQuery billing export datasets only contain the most recent two months of data. It can take a day or two for this data to backfill in BigQuery.*
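To confirm the export is populated before continuing, you can inspect the dataset from the command line. This is a sketch using the `bq` CLI: `<EXPORT_PROJECT>`, `<EXPORT_DATASET>`, and `<BILLING_ACCOUNT_ID>` are placeholders for the values recorded above, and the table name pattern follows Google's documented naming for the detailed export (dashes in the billing account ID become underscores in the table suffix).

```shell
# List tables in the export dataset; the detailed export table should appear
# once Google Cloud has written the first batch of data.
bq ls "<EXPORT_PROJECT>:<EXPORT_DATASET>"

# Check the date range currently available in the detailed export table.
bq query --use_legacy_sql=false \
  'SELECT MIN(usage_start_time) AS earliest, MAX(usage_start_time) AS latest
   FROM `<EXPORT_PROJECT>.<EXPORT_DATASET>.gcp_billing_export_resource_v1_<BILLING_ACCOUNT_ID>`'
```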

#### Enable Google Service APIs{% #enable-google-service-apis %}

The following permissions allow Datadog to access and transfer the billing export into the storage bucket using a scheduled BigQuery query.

- Enable the [BigQuery API](https://cloud.google.com/bigquery/docs/enable-transfer-service).

  1. In the Google Cloud console, go to the project selector page and select your Google Cloud project.
  1. Enable billing on your project for all transfers.

- Enable the [BigQuery Data Transfer Service](https://cloud.google.com/bigquery/docs/enable-transfer-service).

  1. Open the BigQuery Data Transfer API page in the API library.
  1. From the dropdown menu, select the project that contains the service account.
  1. Click **Enable**.

**Note:** BigQuery Data Transfer API needs to be enabled on the Google Project that contains the service account.
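Both APIs can also be enabled with the `gcloud` CLI instead of the console. A minimal sketch, assuming `<SERVICE_ACCOUNT_PROJECT>` is the project that contains the service account (per the note above, that is where the BigQuery Data Transfer API must be enabled):

```shell
# Enable the BigQuery API and the BigQuery Data Transfer API on the
# project that contains the service account.
gcloud services enable \
  bigquery.googleapis.com \
  bigquerydatatransfer.googleapis.com \
  --project="<SERVICE_ACCOUNT_PROJECT>"
```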

#### Configure export project access{% #configure-export-project-access %}

[Add the service account as a principal on the export dataset project resource](https://cloud.google.com/iam/docs/granting-changing-revoking-access#grant-single-role):

1. Navigate to the IAM page in the Google Cloud console and select the export dataset project.
1. Select the service account as a principal.
1. From the drop-down list, select a role that grants the following permissions:

- `bigquery.jobs.create`
- `bigquery.transfers.get`
- `bigquery.transfers.update`

**Note:** This can be a custom role, or you can use the existing Google Cloud role `roles/bigquery.admin`.
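The same grant can be made from the CLI. A sketch with placeholders (`<EXPORT_PROJECT>`, `<SA_EMAIL>`); the custom role ID `datadogCostTransfer` is a hypothetical name chosen here for illustration:

```shell
# Option 1: bind the predefined role mentioned in the note above.
gcloud projects add-iam-policy-binding "<EXPORT_PROJECT>" \
  --member="serviceAccount:<SA_EMAIL>" \
  --role="roles/bigquery.admin"

# Option 2: create a minimal custom role with only the listed permissions,
# then bind it instead.
gcloud iam roles create datadogCostTransfer \
  --project="<EXPORT_PROJECT>" \
  --permissions="bigquery.jobs.create,bigquery.transfers.get,bigquery.transfers.update"
gcloud projects add-iam-policy-binding "<EXPORT_PROJECT>" \
  --member="serviceAccount:<SA_EMAIL>" \
  --role="projects/<EXPORT_PROJECT>/roles/datadogCostTransfer"
```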

#### Configure export BigQuery dataset access{% #configure-export-bigquery-dataset-access %}

[Add the service account as a principal on the export BigQuery dataset resource](https://cloud.google.com/bigquery/docs/control-access-to-resources-iam#grant_access_to_a_dataset):

1. In the Explorer pane on the BigQuery page, expand your project and select the export BigQuery dataset.
1. Click **Sharing > Permissions**, and then click **Add Principal**.
1. In the **New principals** field, enter the service account.
1. Using the **Select a role** list, assign a role with the following permissions:

- `bigquery.datasets.get`
- `bigquery.tables.create`
- `bigquery.tables.delete`
- `bigquery.tables.export`
- `bigquery.tables.get`
- `bigquery.tables.getData`
- `bigquery.tables.list`
- `bigquery.tables.update`
- `bigquery.tables.updateData`

**Note:** This can be a custom role, or you can use the existing Google Cloud role `roles/bigquery.dataEditor`.
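Dataset-level access can also be granted from the CLI by editing the dataset's access list. A sketch, assuming `jq` is installed and using the `<EXPORT_PROJECT>`, `<EXPORT_DATASET>`, and `<SA_EMAIL>` placeholders:

```shell
# Export the current dataset definition, including its access list.
bq show --format=prettyjson "<EXPORT_PROJECT>:<EXPORT_DATASET>" > dataset.json

# Append an access entry for the service account. In the dataset access
# list, the basic WRITER role corresponds to roles/bigquery.dataEditor.
jq '.access += [{"role": "WRITER", "userByEmail": "<SA_EMAIL>"}]' \
  dataset.json > dataset_updated.json

# Apply the updated access list back to the dataset.
bq update --source dataset_updated.json "<EXPORT_PROJECT>:<EXPORT_DATASET>"
```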

### Create or select a Google Cloud Storage bucket{% #create-or-select-a-google-cloud-storage-bucket %}

Use an existing Google Cloud Storage bucket or create a new one. Data is extracted regularly from your Detailed Usage Cost BigQuery dataset to the selected bucket and prefixed with `datadog_cloud_cost_detailed_usage_export`.

**Note:** The bucket [must be co-located](https://cloud.google.com/bigquery/docs/exporting-data#data-locations) with the BigQuery export dataset.
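You can verify co-location from the CLI by comparing the two locations. A sketch using the same placeholders as above plus `<BUCKET_NAME>`:

```shell
# Print the bucket's location.
gcloud storage buckets describe "gs://<BUCKET_NAME>" --format="value(location)"

# Print the export dataset's location; the two values must match.
bq show --format=prettyjson "<EXPORT_PROJECT>:<EXPORT_DATASET>" | jq -r '.location'
```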

#### Configure bucket access{% #configure-bucket-access %}

[Add the service account as a principal on the GCS bucket resource](https://cloud.google.com/storage/docs/access-control/using-iam-permissions#bucket-add):

1. Navigate to the Cloud Storage Buckets page in the Google Cloud console, and select your bucket.
1. Select the **Permissions** tab and click **Grant Access**.
1. In the **New principals** field, enter the service account.
1. Assign a role with the following permissions:
   - `storage.buckets.get`
   - `storage.objects.create`
   - `storage.objects.delete`
   - `storage.objects.get`
   - `storage.objects.list`

**Note:** This can be a custom role, or you can use the existing Google Cloud roles `roles/storage.legacyObjectReader` and `roles/storage.legacyBucketWriter`.
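The bucket bindings can also be applied with the `gcloud` CLI. A sketch using the existing roles named in the note above (`<BUCKET_NAME>` and `<SA_EMAIL>` are placeholders):

```shell
# Grant read access to objects in the bucket.
gcloud storage buckets add-iam-policy-binding "gs://<BUCKET_NAME>" \
  --member="serviceAccount:<SA_EMAIL>" \
  --role="roles/storage.legacyObjectReader"

# Grant write access to the bucket and its objects.
gcloud storage buckets add-iam-policy-binding "gs://<BUCKET_NAME>" \
  --member="serviceAccount:<SA_EMAIL>" \
  --role="roles/storage.legacyBucketWriter"
```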

### (Optional) Configure cross-project service authorization{% #optional-configure-cross-project-service-authorization %}

If your integrated Service Account exists in a different Google Cloud Platform project than your billing export dataset, you need to [grant cross-project service account authorization](https://cloud.google.com/bigquery/docs/enable-transfer-service#cross-project_service_account_authorization):

1. Trigger the service agent creation by following the [official documentation](https://cloud.google.com/iam/docs/create-service-agents#create) using the following values:

   - ENDPOINT: `bigquerydatatransfer.googleapis.com`
   - RESOURCE_TYPE: `project`
   - RESOURCE_ID: the export dataset project

   This creates a service agent of the form `service-<billing project number>@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com`.

1. Add the newly created service agent as a principal on your service account.

1. Assign it the `roles/iam.serviceAccountTokenCreator` role.
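These steps can also be sketched with the `gcloud` CLI. `<EXPORT_PROJECT>`, `<BILLING_PROJECT_NUMBER>`, and `<SA_EMAIL>` are placeholders; the service agent address must match the one created for your projects:

```shell
# Trigger creation of the BigQuery Data Transfer service agent on the
# export dataset project.
gcloud beta services identity create \
  --service=bigquerydatatransfer.googleapis.com \
  --project="<EXPORT_PROJECT>"

# Allow the service agent to create tokens for the integrated service account.
gcloud iam service-accounts add-iam-policy-binding "<SA_EMAIL>" \
  --member="serviceAccount:service-<BILLING_PROJECT_NUMBER>@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"
```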

### Configure Cloud Cost{% #configure-cloud-cost %}

Continue to follow the steps indicated in [Setup & Configuration](https://app.datadoghq.com/cost/setup).

**Note**: Data can take 48 to 72 hours after setup to stabilize in Datadog.

### Getting historical data{% #getting-historical-data %}

Newly created BigQuery billing export datasets only contain the most recent two months of data. It can take a day or two for this data to backfill in BigQuery. Datadog automatically ingests up to 15 months of available historical cost data once it appears in the BigQuery table.

Google Cloud does not provide a process for backfilling additional historical data beyond the two months automatically included when the BigQuery export is first created.

## Cost types{% #cost-types %}

You can visualize your ingested data using the following cost types:

| Cost Type                                       | Description                                                                                                                                                                                                                            |
| ----------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `gcp.cost.amortized`                            | Total cost of resources allocated at the time of usage over an interval. Costs include promotion credits as well as committed usage discount credits.                                                                                  |
| `gcp.cost.amortized.shared.resources.allocated` | All of your Google Cloud Platform amortized costs, with additional breakdowns and insights for container workloads. Requires [container cost allocation](https://docs.datadoghq.com/cloud_cost_management/container_cost_allocation/). |
| `gcp.cost.ondemand`                             | Total public, on-demand cost of resources before public and private discounts are applied over an interval.                                                                                                                            |

### Out-of-the-box tags{% #out-of-the-box-tags %}

Datadog automatically enriches your Google Cloud cost data with tags from multiple sources. For a comprehensive overview of how tags are applied to cost data, see [Tags](https://docs.datadoghq.com/cloud_cost_management/tags).

The following out-of-the-box tags are derived from your [detailed usage cost report](https://cloud.google.com/billing/docs/how-to/export-data-bigquery-tables/detailed-usage) and make it easier to discover and understand cost data:

| Tag Name                       | Tag Description                                                                                     |
| ------------------------------ | --------------------------------------------------------------------------------------------------- |
| `google_product`               | The Google service being billed.                                                                    |
| `google_cost_type`             | The type of charge covered by this item (for example, regular, tax, adjustment, or rounding error). |
| `google_usage_type`            | The usage details of the item (for example, Standard Storage US).                                   |
| `google_location`              | The location associated with the item at the level of a multi-region, country, region, or zone.     |
| `google_region`                | The region associated with the item.                                                                |
| `google_zone`                  | The availability zone associated with the item.                                                     |
| `google_pricing_usage_unit`    | The pricing unit used for calculating the usage cost (for example, gibibyte, tebibyte, or year).    |
| `google_is_unused_reservation` | Whether the usage was reserved but not used.                                                        |
| `service_description`          | The Google Cloud service (such as Compute Engine or BigQuery).                                      |
| `project_id`                   | The ID of the Google Cloud project that generated the Cloud Billing data.                           |
| `project_name`                 | The name of the Google Cloud project that generated the Cloud Billing data.                         |
| `cost_type`                    | The type of cost this line item represents: `regular`, `tax`, `adjustment`, or `rounding error`.    |
| `sku_description`              | A description of the resource type used, describing the usage details of the resource.              |
| `resource_name`                | A name customers add to resources. This may not be on all resources.                                |
| `global_resource_name`         | A globally unique resource identifier generated by Google Cloud.                                    |

#### Cost and observability correlation{% #cost-and-observability-correlation %}

Viewing costs in the context of observability data helps you understand how infrastructure changes impact costs, identify why costs change, and optimize infrastructure for both cost and performance. Datadog updates resource-identifying tags on cost data for top Google products to simplify correlating observability and cost metrics.

For example, to view cost and utilization for each Cloud SQL database, you can make a table with `gcp.cost.amortized`, `gcp.cloudsql.database.cpu.utilization`, and `gcp.cloudsql.database.memory.utilization` (or any other Cloud SQL metric) and group by `database_id`. Or, to see Cloud Function usage and costs side by side, you can graph `gcp.cloudfunctions.function.execution_count` and `gcp.cost.amortized` grouped by `function_name`.

The following out-of-the-box tags are available:

| Google Product    | Tag(s)                         |
| ----------------- | ------------------------------ |
| Compute Engine    | `instance_id`, `instance-type` |
| Cloud Functions   | `function_name`                |
| Cloud Run         | `job_name`, `service_name`     |
| Cloud SQL         | `database_id`                  |
| Cloud Spanner     | `instance_id`                  |
| App Engine        | `module_id`                    |
| BigQuery          | `project_id`, `dataset_id`     |
| Kubernetes Engine | `cluster_name`                 |

### Container allocation{% #container-allocation %}

**Container allocation** metrics contain all of the same costs as the Google Cloud Platform metrics, but with additional breakdowns and insights for container workloads. See [Container Cost Allocation](https://docs.datadoghq.com/cloud_cost_management/container_cost_allocation/) for more details.

## Further reading{% #further-reading %}

- [Cloud Cost Management](https://docs.datadoghq.com/cloud_cost_management/)
- [Gain insights into your AWS bill](https://docs.datadoghq.com/cloud_cost_management/setup/aws)
- [Gain insights into your Azure bill](https://docs.datadoghq.com/cloud_cost_management/azure)
- [Gain insights into your Oracle bill](https://docs.datadoghq.com/cloud_cost_management/oracle)
