
Overview

To use Google Cloud Cost Management in Datadog, follow these steps:

  1. Configure the Google Cloud Platform Integration
  2. Set up the detailed usage cost export with the necessary permissions (Google Service APIs, export project access, and BigQuery Dataset access)
  3. Create or select a Google Cloud Storage bucket with the necessary permissions (Bucket access)

Setup

Configure the Google Cloud Platform integration

Navigate to Setup & Configuration, and select a Google Cloud Platform integration. If you do not see your desired Service Account in the list, go to the Google Cloud Platform integration to configure it.

The Datadog Google Cloud Platform integration allows Cloud Costs to automatically monitor all projects this service account has access to. To limit infrastructure monitoring hosts for these projects, apply tags to the hosts. Then define whether the tags should be included or excluded from monitoring in the Limit Metric Collection Filters section of the integration page.
(Image: the Limit Metric Collection Filters section configured in the Google Cloud Platform integration page)
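
Host filters can also be managed programmatically with the official datadog-api-client Python package. The sketch below is illustrative only and assumes the integration entry already exists; the service account email, project ID, and tags are hypothetical.

```python
# Minimal sketch: update host filters on an existing Datadog <> Google Cloud
# integration entry. Requires DD_API_KEY and DD_APP_KEY in the environment.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.gcp_integration_api import GCPIntegrationApi
from datadog_api_client.v1.model.gcp_account import GCPAccount

configuration = Configuration()  # picks up DD_API_KEY / DD_APP_KEY

with ApiClient(configuration) as api_client:
    api = GCPIntegrationApi(api_client)
    api.update_gcp_integration(
        body=GCPAccount(
            # client_email and project_id identify which integration entry to update
            client_email="datadog-integration@my-project.iam.gserviceaccount.com",
            project_id="my-project",
            # comma-separated tags; only hosts matching these tags are monitored
            host_filters="env:prod,team:billing",
        )
    )
```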

Enable detailed usage cost export

The detailed usage cost data provides all the information included in the standard usage cost data, along with additional fields that provide granular, resource-level cost data.
  1. In the Google Cloud console, navigate to Billing export under Billing.
  2. Enable the Detailed usage cost export (select or create a project and a BigQuery dataset).
  3. Record the Billing Account ID for the billing account where the export was configured, as well as the export Project ID and Dataset Name.
(Image: Google Cloud project and dataset info highlighted)

Note: Newly created BigQuery billing export datasets contain only the most recent two months of data. It can take a day or two for this data to backfill in BigQuery.
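
Once the export is enabled, you can confirm that data is landing by listing the tables in the export dataset. This is an illustrative sketch only; the project and dataset names are hypothetical.

```python
# Sanity check: list tables in the export dataset. The detailed export is
# written to a table named gcp_billing_export_resource_v1_<BILLING_ACCOUNT_ID>,
# with dashes in the billing account ID replaced by underscores.
from google.cloud import bigquery

client = bigquery.Client(project="my-export-project")
for table in client.list_tables("my-export-project.billing_export_dataset"):
    print(table.table_id)
```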

Enable Google Service APIs

Enabling the following Google service APIs allows Datadog to access and transfer the billing export into the storage bucket using a scheduled BigQuery query.

  • Enable the BigQuery API.

    1. In the Google Cloud console, go to the project selector page and select your Google Cloud project.
    2. Enable billing on your project; billing is required for all transfers.
  • Enable the BigQuery Data Transfer Service.

    1. Open the BigQuery Data Transfer API page in the API library.
    2. From the dropdown menu, select the project that contains the service account.
    3. Click the ENABLE button.

    Note: The BigQuery Data Transfer API must be enabled on the Google Cloud project that contains the service account.
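
Both APIs can also be enabled from code. Below is a minimal sketch using the google-cloud-service-usage Python package; the project ID is hypothetical.

```python
# Enable the BigQuery API and the BigQuery Data Transfer API on the
# project that contains the service account.
from google.cloud import service_usage_v1

client = service_usage_v1.ServiceUsageClient()
project = "my-service-account-project"  # hypothetical project ID

for service in ("bigquery.googleapis.com", "bigquerydatatransfer.googleapis.com"):
    operation = client.enable_service(
        request=service_usage_v1.EnableServiceRequest(
            name=f"projects/{project}/services/{service}"
        )
    )
    operation.result()  # wait for the long-running enable operation
```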

Configure export project access

Add the service account as a principal on the export dataset project resource:

  1. Navigate to the IAM page in the Google Cloud console and select the export dataset project.
  2. Select the service account as a principal.
  3. From the drop-down list, select a role that grants the following permissions:
  • bigquery.jobs.create
  • bigquery.transfers.get
  • bigquery.transfers.update

Note: This can be a custom role, or you can use the existing Google Cloud role roles/bigquery.admin.
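
The same binding can be applied from code. A sketch using the google-cloud-resource-manager Python package, with hypothetical project and service account names, and using roles/bigquery.admin as the note above suggests:

```python
# Grant the service account a role on the export dataset project.
from google.cloud import resourcemanager_v3
from google.iam.v1 import iam_policy_pb2, policy_pb2

client = resourcemanager_v3.ProjectsClient()
resource = "projects/my-export-project"  # hypothetical export project

policy = client.get_iam_policy(
    request=iam_policy_pb2.GetIamPolicyRequest(resource=resource)
)
policy.bindings.append(
    policy_pb2.Binding(
        role="roles/bigquery.admin",
        members=["serviceAccount:datadog-integration@my-project.iam.gserviceaccount.com"],
    )
)
client.set_iam_policy(
    request=iam_policy_pb2.SetIamPolicyRequest(resource=resource, policy=policy)
)
```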

Configure export BigQuery dataset access

Add the service account as a principal on the export BigQuery dataset resource:

  1. In the Explorer pane on the BigQuery page, expand your project and select the export BigQuery dataset.
  2. Click Sharing > Permissions, then click Add Principal.
  3. In the New principals field, enter the service account.
  4. Using the Select a role list, assign a role with the following permissions:
  • bigquery.datasets.get
  • bigquery.tables.create
  • bigquery.tables.delete
  • bigquery.tables.export
  • bigquery.tables.get
  • bigquery.tables.getData
  • bigquery.tables.list
  • bigquery.tables.update
  • bigquery.tables.updateData

Note: This can be a custom role, or you can use the existing Google Cloud role roles/bigquery.dataEditor.
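
The dataset grant can also be applied with the BigQuery Python client. A sketch with hypothetical names; the WRITER basic dataset role is equivalent to roles/bigquery.dataEditor, which covers the permissions listed above:

```python
# Add the service account to the export dataset's access entries.
from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset("my-export-project.billing_export_dataset")

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="WRITER",  # basic dataset role equivalent to roles/bigquery.dataEditor
        entity_type="userByEmail",
        entity_id="datadog-integration@my-project.iam.gserviceaccount.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```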

Create or select a Google Cloud Storage bucket

Use an existing Google Cloud Storage bucket or create a new one. Data is extracted regularly from your Detailed Usage Cost BigQuery dataset to the selected bucket and prefixed with datadog_cloud_cost_detailed_usage_export.

Note: The bucket must be co-located with the BigQuery export dataset.
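
If you are creating a new bucket, here is a sketch using the google-cloud-storage Python client. The bucket name and location are hypothetical; match the location to your export dataset.

```python
# Create the bucket in the same location as the BigQuery export dataset.
from google.cloud import storage

client = storage.Client(project="my-export-project")
bucket = client.create_bucket("my-datadog-cost-export-bucket", location="US")
print(f"Created {bucket.name} in {bucket.location}")
```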

Configure bucket access

Add the service account as a principal on the GCS bucket resource:

  1. Navigate to the Cloud Storage Buckets page in the Google Cloud console, and select your bucket.
  2. Select the Permissions tab and click the Grant Access button.
  3. In the New principals field, enter the service account.
  4. Assign a role with the following permissions:
    • storage.buckets.get
    • storage.objects.create
    • storage.objects.delete
    • storage.objects.get
    • storage.objects.list

Note: This can be a custom role, or you can use the existing Google Cloud roles roles/storage.legacyObjectReader and roles/storage.legacyBucketWriter.
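
A sketch of the equivalent grant with the google-cloud-storage Python client, using the two legacy roles from the note above (bucket and service account names are hypothetical):

```python
# Grant the service account the legacy reader/writer roles on the bucket.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-datadog-cost-export-bucket")
member = "serviceAccount:datadog-integration@my-project.iam.gserviceaccount.com"

policy = bucket.get_iam_policy(requested_policy_version=3)
for role in ("roles/storage.legacyObjectReader", "roles/storage.legacyBucketWriter"):
    policy.bindings.append({"role": role, "members": {member}})
bucket.set_iam_policy(policy)
```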

(Optional) Configure cross-project service account authorization

If your integrated Service Account exists in a different Google Cloud Platform project than your billing export dataset, you need to grant cross-project service account authorization:

  1. Trigger the service agent creation by following the official documentation using the following values:

    • ENDPOINT: bigquerydatatransfer.googleapis.com

    • RESOURCE_TYPE: project

    • RESOURCE_ID: export dataset project

      This creates a new service agent that looks like service-<billing project number>@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com.

  2. Add the BigQuery Data Transfer service agent created in step 1 as a principal on your service account.

  3. Assign it the roles/iam.serviceAccountTokenCreator role.
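
Steps 2 and 3 amount to a single IAM binding on the integrated service account. A sketch using the google-api-python-client IAM API; all names below are hypothetical.

```python
# Allow the BigQuery Data Transfer service agent to mint tokens for the
# integrated service account.
from googleapiclient import discovery

iam = discovery.build("iam", "v1")

sa_resource = (
    "projects/my-project/serviceAccounts/"
    "datadog-integration@my-project.iam.gserviceaccount.com"
)
agent = "serviceAccount:service-123456789@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com"

policy = iam.projects().serviceAccounts().getIamPolicy(resource=sa_resource).execute()
policy.setdefault("bindings", []).append(
    {"role": "roles/iam.serviceAccountTokenCreator", "members": [agent]}
)
iam.projects().serviceAccounts().setIamPolicy(
    resource=sa_resource, body={"policy": policy}
).execute()
```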

Configure Cloud Costs

Continue to follow the steps indicated in Setup & Configuration.

Cost types

You can visualize your ingested data using the following cost types:

| Cost Type | Description |
|---|---|
| gcp.cost.amortized | Total cost of resources allocated at the time of usage over an interval. Costs include promotion credits as well as committed usage discount credits. |
| gcp.cost.amortized.shared.resources.allocated | All of your Google Cloud Platform amortized costs, with additional breakdowns and insights for container workloads. Requires container cost allocation. |
| gcp.cost.ondemand | Total public, on-demand cost of resources before public and private discounts are applied over an interval. |

Out-of-the-box tags

Datadog adds out-of-the-box tags to ingested cost data to help you further break down and allocate your costs. These tags are derived from your detailed usage cost report and make it easier to discover and understand cost data.

The following out-of-the-box tags are available for filtering and grouping data:

| Tag Name | Tag Description |
|---|---|
| google_product | The Google service being billed. |
| google_cost_type | The type of charge covered by this item (for example, regular, tax, adjustment, or rounding error). |
| google_usage_type | The usage details of the item (for example, Standard Storage US). |
| google_location | The location associated with the item at the level of a multi-region, country, region, or zone. |
| google_region | The region associated with the item. |
| google_zone | The availability zone associated with the item. |
| google_pricing_usage_unit | The pricing unit used for calculating the usage cost (for example, gibibyte, tebibyte, or year). |
| google_is_unused_reservation | Whether the usage was reserved but not used. |
| service_description | The Google Cloud service (such as Compute Engine or BigQuery). |
| project_id | The ID of the Google Cloud project that generated the Cloud Billing data. |
| project_name | The name of the Google Cloud project that generated the Cloud Billing data. |
| cost_type | The type of cost this line item represents: regular, tax, adjustment, or rounding error. |
| sku_description | A description of the resource type used, describing the usage details of the resource. |
| resource_name | A name customers add to resources. This may not be on all resources. |
| global_resource_name | A globally unique resource identifier generated by Google Cloud. |

Cost and observability correlation

Viewing costs alongside observability data is important for understanding how infrastructure changes impact costs, identifying why costs change, and optimizing infrastructure for both cost and performance. Datadog updates resource-identifying tags on cost data for top Google products to simplify correlating observability and cost metrics.

For example, to view cost and utilization for each Cloud SQL database, you can make a table with gcp.cost.amortized, gcp.cloudsql.database.cpu.utilization, and gcp.cloudsql.database.memory.utilization (or any other Cloud SQL metric) and group by database_id. Or, to see Cloud Function usage and costs side by side, you can graph gcp.cloudfunctions.function.execution_count and gcp.cost.amortized grouped by function_name.

The following out-of-the-box tags are available:

| Google Product | Tag(s) |
|---|---|
| Compute Engine | instance_id, instance-type |
| Cloud Functions | function_name |
| Cloud Run | job_name, service_name |
| Cloud SQL | database_id |
| Cloud Spanner | instance_id |
| App Engine | module_id |
| BigQuery | project_id, dataset_id |
| Kubernetes Engine | cluster_name |

Container allocation

Container allocation metrics contain all of the same costs as the Google Cloud Platform metrics, but with additional breakdowns and insights for container workloads. See Container Cost Allocation for more details.

Further reading