Overview
To use Google Cloud Cost Management in Datadog, follow these steps:
- Configure the Google Cloud Platform Integration
- Set up the detailed usage cost export with the necessary permissions (Google Service APIs, export project access, and BigQuery Dataset access)
- Create or select a Google Cloud Storage bucket with the necessary permissions (Bucket access)
Setup
Navigate to Setup & Configuration, and select a Google Cloud Platform integration.
If you do not see your desired Service Account in the list, go to the Google Cloud Platform integration to configure it.
The Datadog Google Cloud Platform integration allows Cloud Costs to automatically monitor all projects this service account has access to.
To limit infrastructure monitoring hosts for these projects, apply tags to the hosts. Then define whether the tags should be included or excluded from monitoring in the Limit Metric Collection Filters section of the integration page.
Enable detailed usage cost export
The detailed usage cost data provides all the information included in the standard usage cost data, along with additional fields that provide granular, resource-level cost data.
- Navigate to Billing Export under Google Cloud console Billing.
- Enable the Detailed Usage cost export (select or create a project and a BigQuery dataset).
- Document the Billing Account ID for the billing account where the export was configured, as well as the export Project ID and Dataset Name.
Newly created BigQuery billing export datasets only contain the most recent two months of data. It can take a day or two for this data to backfill in BigQuery.
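Once the export is enabled, you can confirm that the dataset exists and is receiving data with the `bq` CLI. This is a sketch with placeholder names: `my-export-project` and `billing_export_dataset` stand in for the project and dataset you selected above, and the `XXXXXX_XXXXXX_XXXXXX` suffix stands in for your actual Billing Account ID.

```shell
# List the tables in the export dataset; the detailed export table is
# named gcp_billing_export_resource_v1_<BILLING_ACCOUNT_ID>.
bq ls --project_id=my-export-project billing_export_dataset

# Inspect the table's schema and row count once data has landed.
bq show my-export-project:billing_export_dataset.gcp_billing_export_resource_v1_XXXXXX_XXXXXX_XXXXXX
```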
Enable Google Service APIs
The following permissions allow Datadog to access and transfer the billing export into the storage bucket using a scheduled BigQuery query.
Add the service account as a principal on the export dataset project resource:
- Navigate to the IAM page in the Google Cloud console and select the export dataset project.
- Select the service account as a principal.
- Select a role with the following permissions to grant from the drop-down list:
  - bigquery.jobs.create
  - bigquery.transfers.get
  - bigquery.transfers.update
Note: This can be a custom role, or you can use the existing Google Cloud role roles/bigquery.admin.
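The steps above can also be performed with the gcloud CLI. This is a sketch under placeholder names: `my-export-project`, the custom role ID `datadogCostTransfer`, and the service account email are all assumptions to replace with your own values.

```shell
# Create a custom role with only the permissions Datadog needs on the
# export dataset project.
gcloud iam roles create datadogCostTransfer \
  --project=my-export-project \
  --permissions=bigquery.jobs.create,bigquery.transfers.get,bigquery.transfers.update

# Bind the custom role to the integrated Datadog service account at the
# project level.
gcloud projects add-iam-policy-binding my-export-project \
  --member="serviceAccount:datadog-ccm@my-project.iam.gserviceaccount.com" \
  --role="projects/my-export-project/roles/datadogCostTransfer"
```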
Add the service account as a principal on the export BigQuery dataset resource:
- In the Explorer pane on the BigQuery page, expand your project and select the export BigQuery dataset.
- Click Sharing > Permissions and then add principal.
- In the new principals field, enter the service account.
- Using the select a role list, assign a role with the following permissions:
  - bigquery.datasets.get
  - bigquery.tables.create
  - bigquery.tables.delete
  - bigquery.tables.export
  - bigquery.tables.get
  - bigquery.tables.getData
  - bigquery.tables.list
  - bigquery.tables.update
  - bigquery.tables.updateData
Note: This can be a custom role, or you can use the existing Google Cloud role roles/bigquery.dataEditor.
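If you prefer the CLI over the console, dataset-level access can be granted by editing the dataset's access entries with `bq`. This is a sketch with placeholder names (`my-export-project`, `billing_export_dataset`, and the service account email); the WRITER basic role corresponds to roles/bigquery.dataEditor at the dataset level.

```shell
# Export the dataset's current metadata, including its "access" array.
bq show --format=prettyjson my-export-project:billing_export_dataset > dataset.json

# Edit dataset.json and append an entry to the "access" array, for example:
#   { "role": "WRITER", "userByEmail": "datadog-ccm@my-project.iam.gserviceaccount.com" }

# Apply the updated access list back to the dataset.
bq update --source dataset.json my-export-project:billing_export_dataset
```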
Create or select a Google Cloud Storage bucket
Use an existing Google Cloud Storage bucket or create a new one.
Data is extracted regularly from your Detailed Usage Cost BigQuery dataset to the selected bucket and prefixed with datadog_cloud_cost_detailed_usage_export.
Note: The bucket must be co-located with the BigQuery export dataset.
Add the service account as a principal on the GCS bucket resource:
- Navigate to the Cloud Storage Buckets page in the Google Cloud console, and select your bucket.
- Select the permissions tab and click the grant access button.
- In the new principals field, enter the service account.
- Assign a role with the following permissions:
  - storage.buckets.get
  - storage.objects.create
  - storage.objects.delete
  - storage.objects.get
  - storage.objects.list
Note: This can be a custom role, or you can use the existing Google Cloud roles roles/storage.legacyObjectReader and roles/storage.legacyBucketWriter.
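The bucket grants above can be applied from the CLI as well. This sketch assumes placeholder names: `gs://my-datadog-ccm-bucket` and the service account email are examples, not values from your environment.

```shell
# Grant the two legacy roles listed above to the integrated service
# account on the export bucket.
gcloud storage buckets add-iam-policy-binding gs://my-datadog-ccm-bucket \
  --member="serviceAccount:datadog-ccm@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.legacyObjectReader"

gcloud storage buckets add-iam-policy-binding gs://my-datadog-ccm-bucket \
  --member="serviceAccount:datadog-ccm@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.legacyBucketWriter"
```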
If your integrated Service Account exists in a different Google Cloud Platform project than your billing export dataset, you need to grant cross-project service account authorization:
Trigger the service agent creation by following the official documentation using the following values:
  - ENDPOINT: bigquerydatatransfer.googleapis.com
  - RESOURCE_TYPE: project
  - RESOURCE_ID: export dataset project
This creates a new service agent that looks like service-<billing project number>@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com.
Add the BigQuery Data Transfer service agent created by the trigger as a principal on your service account, and assign it the roles/iam.serviceAccountTokenCreator role.
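As a sketch, the cross-project authorization can be performed with two gcloud commands. The project ID, project number, and service account email below are placeholders for your own values.

```shell
# 1. Trigger creation of the BigQuery Data Transfer service agent in the
#    export dataset project.
gcloud beta services identity create \
  --service=bigquerydatatransfer.googleapis.com \
  --project=my-export-project

# 2. Allow the resulting service agent to mint tokens for the integrated
#    Datadog service account (replace 123456789 with the project number
#    printed by the command above).
gcloud iam service-accounts add-iam-policy-binding \
  datadog-ccm@my-project.iam.gserviceaccount.com \
  --member="serviceAccount:service-123456789@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"
```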
Continue to follow the steps indicated in Setup & Configuration.
Cost types
You can visualize your ingested data using the following cost types:
| Cost Type | Description |
|---|---|
| gcp.cost.amortized | Total cost of resources allocated at the time of usage over an interval. Costs include promotion credits as well as committed usage discount credits. |
| gcp.cost.amortized.shared.resources.allocated | All of your Google Cloud Platform amortized costs, with additional breakdowns and insights for container workloads. Requires container cost allocation. |
| gcp.cost.ondemand | Total public, on-demand cost of resources before public and private discounts are applied over an interval. |
Datadog adds out-of-the-box tags to ingested cost data to help you further break down and allocate your costs. These tags are derived from your detailed usage cost report and make it easier to discover and understand cost data.
The following out-of-the-box tags are available for filtering and grouping data:
| Tag Name | Tag Description |
|---|---|
| google_product | The Google service being billed. |
| google_cost_type | The type of charge covered by this item (for example, regular, tax, adjustment, or rounding error). |
| google_usage_type | The usage details of the item (for example, Standard Storage US). |
| google_location | The location associated with the item at the level of a multi-region, country, region, or zone. |
| google_region | The region associated with the item. |
| google_zone | The availability zone associated with the item. |
| google_pricing_usage_unit | The pricing unit used for calculating the usage cost (for example, gibibyte, tebibyte, or year). |
| google_is_unused_reservation | Whether the usage was reserved but not used. |
| service_description | The Google Cloud service (such as Compute Engine or BigQuery). |
| project_id | The ID of the Google Cloud project that generated the Cloud Billing data. |
| project_name | The name of the Google Cloud project that generated the Cloud Billing data. |
| cost_type | The type of cost this line item represents: regular, tax, adjustment, or rounding error. |
| sku_description | A description of the resource type used, describing the usage details of the resource. |
| resource_name | A name customers add to resources. This may not be present on all resources. |
| global_resource_name | A globally unique resource identifier generated by Google Cloud. |
Cost and observability correlation
Viewing costs in context of observability data is important to understand how infrastructure changes impact costs, identify why costs change, and optimize infrastructure for both costs and performance. Datadog updates resource identifying tags on cost data for top Google products to simplify correlating observability and cost metrics.
For example, to view cost and utilization for each Cloud SQL database, you can make a table with gcp.cost.amortized, gcp.cloudsql.database.cpu.utilization, and gcp.cloudsql.database.memory.utilization (or any other Cloud SQL metric) and group by database_id. Or, to see Cloud Function usage and costs side by side, you can graph gcp.cloudfunctions.function.execution_count and gcp.cost.amortized grouped by function_name.
The following out-of-the-box tags are available:
| Google Product | Tag(s) |
|---|---|
| Compute Engine | instance_id, instance-type |
| Cloud Functions | function_name |
| Cloud Run | job_name, service_name |
| Cloud SQL | database_id |
| Cloud Spanner | instance_id |
| App Engine | module_id |
| BigQuery | project_id, dataset_id |
| Kubernetes Engine | cluster_name |
Container allocation
Container allocation metrics contain all of the same costs as the Google Cloud Platform metrics, but with additional breakdowns and insights for container workloads. See Container Cost Allocation for more details.
Further reading
More helpful links, articles, and documentation: