Cloud Cost for Google Cloud is in private beta
Overview
To use Google Cloud Cost Management in Datadog, you must configure the Google Cloud Platform Integration and set up the detailed usage cost billing export for your desired billing account. Additionally, the Google Cloud Platform Datadog Integration Service Account must have the necessary permissions configured to interact with Google Cloud Storage and BigQuery.
Contact Support to sign up for private beta or for additional assistance.
Setup
Enable detailed usage cost export
- Navigate to Billing Export under the Google Cloud console Billing section.
- Enable the Detailed usage cost export, selecting or creating a project and a BigQuery dataset.
- Document the billing account ID of the billing account where the export was configured, as well as the export project ID and dataset name.
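If you want to verify the export is flowing before continuing, the sketch below (assuming the google-cloud-bigquery Python client and hypothetical project and dataset names) lists the tables in the export dataset and looks for the detailed usage cost table, which Google names with a `gcp_billing_export_resource_v1_` prefix.

```python
# A minimal sketch, assuming google-cloud-bigquery and placeholder names,
# to confirm the detailed usage cost export table exists in your dataset.
from google.cloud import bigquery

EXPORT_PROJECT_ID = "my-billing-export-project"  # hypothetical: the export project ID you noted
EXPORT_DATASET = "billing_export"                # hypothetical: the export dataset name you noted

client = bigquery.Client(project=EXPORT_PROJECT_ID)

# The detailed usage cost export writes to a table named
# gcp_billing_export_resource_v1_<billing account ID with dashes replaced by underscores>.
for table in client.list_tables(f"{EXPORT_PROJECT_ID}.{EXPORT_DATASET}"):
    if table.table_id.startswith("gcp_billing_export_resource_v1_"):
        print(f"Found detailed usage export table: {table.table_id}")
```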
Create or select a Google Cloud Storage bucket
Use an existing GCS bucket or create a new one. Data is extracted regularly from your Detailed Usage Cost BigQuery dataset to a datadog_cloud_cost_detailed_usage_export prefix in the selected bucket.
Note: The bucket must be co-located with the BigQuery export dataset.
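One way to satisfy the co-location requirement is to read the export dataset's location and create the bucket there. The sketch below assumes the google-cloud-bigquery and google-cloud-storage Python clients and hypothetical project, dataset, and bucket names.

```python
# A minimal sketch, assuming placeholder names, that creates the bucket in the
# same location as the BigQuery export dataset to satisfy the co-location note.
from google.cloud import bigquery, storage

EXPORT_PROJECT_ID = "my-billing-export-project"   # hypothetical
EXPORT_DATASET = "billing_export"                 # hypothetical
BUCKET_NAME = "my-datadog-cloud-cost-bucket"      # hypothetical

bq = bigquery.Client(project=EXPORT_PROJECT_ID)
dataset = bq.get_dataset(f"{EXPORT_PROJECT_ID}.{EXPORT_DATASET}")

gcs = storage.Client(project=EXPORT_PROJECT_ID)
# Create the bucket in the same location as the export dataset.
bucket = gcs.create_bucket(BUCKET_NAME, location=dataset.location)
print(f"Created {bucket.name} in {bucket.location}")
```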
- Navigate to Setup & Configuration, and select an integrated Google Cloud Platform service account from the menu.
- If you do not see your desired Service Account in the list, go to the Google Cloud Platform integration to add it.
- Follow the instructions on the integration page to properly configure the service account you are integrating.
LEGACY project integrations are deprecated and not supported.
Datadog Google Cloud Platform integrations monitor the entire project when a related service account is integrated. Using a previously integrated project avoids bringing resources from a new project under monitoring. If your billing export is associated with a project that is not yet integrated, integrating it for Cloud Cost also brings that project's resources under monitoring.
Note: You can limit metric collection per integration. This limits which resources in the project are automatically monitored, but does not exclude any data from Cloud Cost processing.
Provide the service account with the necessary permissions
The following APIs and permissions enable Datadog to access your Detailed Usage billing export data and extract it into a usable format. A scheduled BigQuery query exports the data from BigQuery, where Google Cloud writes the billing export, to your specified GCS bucket, where Datadog can then process it.
Enable necessary Google Service APIs:
Enable the BigQuery and BigQuery Data Transfer Service APIs if they are not already enabled.
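If you prefer to enable the APIs programmatically, the sketch below assumes the google-api-python-client package and a hypothetical project ID, and calls the Service Usage API; enabling both APIs in the Google Cloud console works just as well.

```python
# A minimal sketch, assuming google-api-python-client and a hypothetical
# project ID, that enables the two required APIs via the Service Usage API.
from googleapiclient.discovery import build

PROJECT_ID = "my-billing-export-project"  # hypothetical: the export dataset project

serviceusage = build("serviceusage", "v1")
for api in ("bigquery.googleapis.com", "bigquerydatatransfer.googleapis.com"):
    request = serviceusage.services().enable(
        name=f"projects/{PROJECT_ID}/services/{api}", body={}
    )
    print(request.execute())  # returns a long-running operation
```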
Add the service account as a principal on the GCS bucket resource and assign a role with the following permissions:
storage.buckets.get
storage.objects.create
storage.objects.delete
storage.objects.get
storage.objects.list
Note: This can be a custom role, or you can use the existing Google Cloud roles roles/storage.legacyObjectReader and roles/storage.legacyBucketWriter.
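As a sketch of the bucket-level grant, the snippet below uses the google-cloud-storage Python client with hypothetical bucket and service account names to append the two predefined roles from the note to the bucket's IAM policy.

```python
# A minimal sketch, assuming hypothetical bucket and service account names,
# that adds the Datadog integration service account to the bucket IAM policy.
from google.cloud import storage

BUCKET_NAME = "my-datadog-cloud-cost-bucket"                               # hypothetical
SERVICE_ACCOUNT = "datadog-integration@my-project.iam.gserviceaccount.com"  # hypothetical

client = storage.Client()
bucket = client.bucket(BUCKET_NAME)

policy = bucket.get_iam_policy(requested_policy_version=3)
for role in ("roles/storage.legacyObjectReader", "roles/storage.legacyBucketWriter"):
    policy.bindings.append(
        {"role": role, "members": {f"serviceAccount:{SERVICE_ACCOUNT}"}}
    )
bucket.set_iam_policy(policy)
```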
Add the service account as a principal on the export dataset project resource and assign a role with the following permissions:
bigquery.jobs.create
bigquery.transfers.get
bigquery.transfers.update
Note: This can be a custom role, or you can use the existing Google Cloud role roles/bigquery.admin.
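A project-level grant can also be scripted with the Cloud Resource Manager API. The sketch below assumes the google-api-python-client package and hypothetical project and service account names, and binds roles/bigquery.admin as mentioned in the note; substitute a custom role with the three permissions above if you prefer.

```python
# A minimal sketch, assuming hypothetical names, that binds a role on the
# export dataset project for the Datadog integration service account.
from googleapiclient.discovery import build

PROJECT_ID = "my-billing-export-project"                                    # hypothetical
SERVICE_ACCOUNT = "datadog-integration@my-project.iam.gserviceaccount.com"  # hypothetical
ROLE = "roles/bigquery.admin"  # or a custom role with the permissions listed above

crm = build("cloudresourcemanager", "v1")
policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()
policy["bindings"].append(
    {"role": ROLE, "members": [f"serviceAccount:{SERVICE_ACCOUNT}"]}
)
crm.projects().setIamPolicy(
    resource=PROJECT_ID, body={"policy": policy}
).execute()
```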
Add the service account as a principal on the export BigQuery dataset resource and assign a role with the following permissions:
bigquery.datasets.get
bigquery.tables.create
bigquery.tables.delete
bigquery.tables.export
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.list
bigquery.tables.update
bigquery.tables.updateData
Note: This can be a custom role, or you can use the existing Google Cloud role roles/bigquery.dataEditor.
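Dataset-level access can be granted through BigQuery access entries. The sketch below assumes the google-cloud-bigquery Python client and hypothetical names, and uses the basic WRITER dataset role, which corresponds to roles/bigquery.dataEditor from the note.

```python
# A minimal sketch, assuming hypothetical names, that grants the service
# account access on the export dataset via a dataset access entry.
from google.cloud import bigquery

EXPORT_PROJECT_ID = "my-billing-export-project"                             # hypothetical
EXPORT_DATASET = "billing_export"                                           # hypothetical
SERVICE_ACCOUNT = "datadog-integration@my-project.iam.gserviceaccount.com"  # hypothetical

client = bigquery.Client(project=EXPORT_PROJECT_ID)
dataset = client.get_dataset(f"{EXPORT_PROJECT_ID}.{EXPORT_DATASET}")

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="WRITER",  # the basic dataset role equivalent to roles/bigquery.dataEditor
        entity_type="userByEmail",
        entity_id=SERVICE_ACCOUNT,
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```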
If your integrated Service Account exists in a different Google Cloud Platform project than your billing export dataset, you need to grant cross-project service account authorization:
Trigger service agent creation using the following values:
- ENDPOINT: bigquerydatatransfer.googleapis.com
- RESOURCE_TYPE: project
- RESOURCE_ID: export dataset project
This creates a new service agent that looks like service-<billing project id>@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com.
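The trigger above corresponds to the Service Usage generateServiceIdentity call. The sketch below assumes the google-api-python-client package, the v1beta1 generateServiceIdentity method, and a hypothetical export dataset project ID; treat it as an illustration of the REST trigger rather than the only way to create the service agent.

```python
# A minimal sketch, assuming the Service Usage v1beta1 generateServiceIdentity
# method and a hypothetical project ID, that triggers service agent creation.
from googleapiclient.discovery import build

EXPORT_PROJECT_ID = "my-billing-export-project"  # hypothetical: RESOURCE_ID above

serviceusage = build("serviceusage", "v1beta1")
operation = serviceusage.services().generateServiceIdentity(
    parent=f"projects/{EXPORT_PROJECT_ID}/services/bigquerydatatransfer.googleapis.com"
).execute()
print(operation)
```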
Add the BigQuery Data Transfer service agent created by the trigger as a principal on your Service Account resource and assign it the roles/iam.serviceAccountTokenCreator role.
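The token creator grant can also be scripted. The sketch below assumes the google-api-python-client package and hypothetical service account emails, and adds the service agent to the IAM policy of the integrated service account.

```python
# A minimal sketch, assuming hypothetical emails, that grants the BigQuery
# Data Transfer service agent roles/iam.serviceAccountTokenCreator on the
# integrated Datadog service account.
from googleapiclient.discovery import build

INTEGRATED_SA = "datadog-integration@my-project.iam.gserviceaccount.com"  # hypothetical
# Service agent created by the trigger, in the export dataset project (hypothetical number):
TRANSFER_AGENT = "service-123456789@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com"

iam = build("iam", "v1")
resource = f"projects/-/serviceAccounts/{INTEGRATED_SA}"

policy = iam.projects().serviceAccounts().getIamPolicy(resource=resource).execute()
policy.setdefault("bindings", []).append(
    {
        "role": "roles/iam.serviceAccountTokenCreator",
        "members": [f"serviceAccount:{TRANSFER_AGENT}"],
    }
)
iam.projects().serviceAccounts().setIamPolicy(
    resource=resource, body={"policy": policy}
).execute()
```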
Navigate to Setup & Configuration and follow the steps provided in-app.
Cost types
You can visualize your ingested data using the following cost types:
| Cost Type | Description |
|---|---|
| gcp.cost.usage_date | Total cost of resources allocated at the time of usage over an interval. Costs include promotion credits as well as committed usage discount credits. |
Further reading
Additional helpful documentation, links, and articles: