To use Google Cloud Cost Management in Datadog, follow these steps:
Navigate to Setup & Configuration, and select a Google Cloud Platform integration. If you do not see your desired Service Account in the list, go to the Google Cloud Platform integration to configure it.
Provide the Billing Account ID for the billing account where the export was configured, as well as the export Project ID and Dataset Name.

Note: Newly created BigQuery billing export datasets contain only the most recent two months of data. It can take a day or two for this data to backfill in BigQuery.
The following permissions allow Datadog to access and transfer the billing export into the storage bucket using a scheduled BigQuery query.
Enable the BigQuery API.
Enable the BigQuery Data Transfer Service.
Note: The BigQuery Data Transfer API must be enabled on the Google Cloud project that contains the service account.
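If you prefer the command line, both services can be enabled with the gcloud CLI. A minimal sketch, assuming a placeholder project ID `my-billing-project` for the project that contains the service account:

```shell
# Enable both APIs on the project that contains the service account.
# "my-billing-project" is a placeholder project ID.
gcloud services enable \
    bigquery.googleapis.com \
    bigquerydatatransfer.googleapis.com \
    --project=my-billing-project
```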
Add the service account as a principal on the export dataset's project resource, with the following permissions:

- `bigquery.jobs.create`
- `bigquery.transfers.get`
- `bigquery.transfers.update`

Note: This can be a custom role, or you can use the existing Google Cloud role `roles/bigquery.admin`.
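As a sketch of this project-level grant with the gcloud CLI, using the predefined `roles/bigquery.admin` role rather than a custom role (project ID and service account email are placeholders):

```shell
# Grant the integration's service account BigQuery admin on the
# export dataset's project. All names are placeholders.
gcloud projects add-iam-policy-binding my-export-project \
    --member="serviceAccount:datadog-cost@my-billing-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.admin"
```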
Add the service account as a principal on the export BigQuery dataset resource, with the following permissions:

- `bigquery.datasets.get`
- `bigquery.tables.create`
- `bigquery.tables.delete`
- `bigquery.tables.export`
- `bigquery.tables.get`
- `bigquery.tables.getData`
- `bigquery.tables.list`
- `bigquery.tables.update`
- `bigquery.tables.updateData`

Note: This can be a custom role, or you can use the existing Google Cloud role `roles/bigquery.dataEditor`.
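One way to grant dataset-level access is to edit the dataset's access list with the `bq` CLI. A sketch, assuming a placeholder dataset `my-export-project:billing_export`; the `WRITER` basic role on a dataset is BigQuery's documented equivalent of `roles/bigquery.dataEditor`:

```shell
# Dump the dataset definition to a local file.
bq show --format=prettyjson my-export-project:billing_export > dataset.json

# Edit dataset.json and append an entry to the "access" array:
#   {
#     "role": "WRITER",
#     "userByEmail": "datadog-cost@my-billing-project.iam.gserviceaccount.com"
#   }

# Apply the updated access list back to the dataset.
bq update --source dataset.json my-export-project:billing_export
```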
Use an existing Google Cloud Storage bucket or create a new one. Data is extracted regularly from your Detailed Usage Cost BigQuery dataset to the selected bucket and prefixed with `datadog_cloud_cost_detailed_usage_export`.

Note: The bucket must be co-located with the BigQuery export dataset.
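For example, a new bucket in the same location as the export dataset could be created like this (bucket name, project, and location are placeholders):

```shell
# Create a bucket co-located with the BigQuery export dataset.
gcloud storage buckets create gs://my-datadog-cost-export \
    --project=my-export-project \
    --location=US
```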
Add the service account as a principal on the GCS bucket resource, with the following permissions:

- `storage.buckets.get`
- `storage.objects.create`
- `storage.objects.delete`
- `storage.objects.get`
- `storage.objects.list`

Note: This can be a custom role, or you can use the existing Google Cloud roles `roles/storage.legacyObjectReader` and `roles/storage.legacyBucketWriter`.
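A sketch of the bucket-level grant using the two predefined legacy roles (bucket and service account names are placeholders):

```shell
# Let the service account read objects in, and write objects to,
# the export bucket.
gcloud storage buckets add-iam-policy-binding gs://my-datadog-cost-export \
    --member="serviceAccount:datadog-cost@my-billing-project.iam.gserviceaccount.com" \
    --role="roles/storage.legacyObjectReader"

gcloud storage buckets add-iam-policy-binding gs://my-datadog-cost-export \
    --member="serviceAccount:datadog-cost@my-billing-project.iam.gserviceaccount.com" \
    --role="roles/storage.legacyBucketWriter"
```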
If your integrated Service Account exists in a different Google Cloud Platform project than your billing export dataset, you need to grant cross-project service account authorization:
Trigger the service agent creation by following the official documentation, using the following values:

- ENDPOINT: `bigquerydatatransfer.googleapis.com`
- RESOURCE_TYPE: `project`
- RESOURCE_ID: the export dataset's project

This creates a new service agent that looks like `service-<billing project number>@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com`.

Add the BigQuery Data Transfer service agent created by the trigger as a principal on your service account, and assign it the `roles/iam.serviceAccountTokenCreator` role.
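Both steps can be done from the gcloud CLI. A sketch with placeholder names; the project number in the service agent email comes from the output of the first command:

```shell
# Trigger creation of the BigQuery Data Transfer service agent
# in the export dataset's project.
gcloud beta services identity create \
    --service=bigquerydatatransfer.googleapis.com \
    --project=my-export-project

# Allow that service agent to create tokens for the integration's
# service account. Replace 123456789012 with the project number
# from the generated service agent email.
gcloud iam service-accounts add-iam-policy-binding \
    datadog-cost@my-billing-project.iam.gserviceaccount.com \
    --member="serviceAccount:service-123456789012@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com" \
    --role="roles/iam.serviceAccountTokenCreator"
```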
Continue to follow the steps indicated in Setup & Configuration.
You can visualize your ingested data using the following cost types:
| Cost Type | Description |
|---|---|
| `gcp.cost.amortized` | Total cost of resources allocated at the time of usage over an interval. Costs include promotion credits as well as committed usage discount credits. |
| `gcp.cost.amortized.shared.resources.allocated` | All of your Google Cloud Platform amortized costs, with additional breakdowns and insights for container workloads. Requires container cost allocation. |
Datadog adds out-of-the-box tags to ingested cost data to help you further break down and allocate your costs. These tags are derived from your detailed usage cost report and make it easier to discover and understand cost data.
The following out-of-the-box tags are available for filtering and grouping data:
| Tag Name | Tag Description |
|---|---|
| `google_product` | The Google service being billed. |
| `google_cost_type` | The type of charge covered by this item (for example, regular, tax, adjustment, or rounding error). |
| `google_usage_type` | The usage details of the item (for example, Standard Storage US). |
| `google_location` | The location associated with the item, at the level of a multi-region, country, region, or zone. |
| `google_region` | The region associated with the item. |
| `google_zone` | The availability zone associated with the item. |
| `google_pricing_usage_unit` | The pricing unit used for calculating the usage cost (for example, gibibyte, tebibyte, or year). |
| `google_is_unused_reservation` | Whether the usage was reserved but not used. |
| `service_description` | The Google Cloud service (such as Compute Engine or BigQuery). |
| `project_id` | The ID of the Google Cloud project that generated the Cloud Billing data. |
| `project_name` | The name of the Google Cloud project that generated the Cloud Billing data. |
| `cost_type` | The type of cost this line item represents: `regular`, `tax`, `adjustment`, or `rounding error`. |
| `sku_description` | A description of the resource type used, describing the usage details of the resource. |
| `resource_name` | A name customers add to resources. This may not be present on all resources. |
| `global_resource_name` | A globally unique resource identifier generated by Google Cloud. |
Viewing costs in context of observability data is important to understand how infrastructure changes impact costs, identify why costs change, and optimize infrastructure for both costs and performance. Datadog updates resource identifying tags on cost data for top Google products to simplify correlating observability and cost metrics.
For example, to view cost and utilization for each Cloud SQL database, you can make a table with `gcp.cost.amortized`, `gcp.cloudsql.database.cpu.utilization`, and `gcp.cloudsql.database.memory.utilization` (or any other Cloud SQL metric), grouped by `database_id`. Or, to see Cloud Function usage and costs side by side, you can graph `gcp.cloudfunctions.function.execution_count` and `gcp.cost.amortized` grouped by `function_name`.
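As a rough illustration, the underlying metric queries for those two views might look like the following (standard Datadog query syntax; adjust the aggregation and filters to your needs):

```text
sum:gcp.cost.amortized{*} by {database_id}
avg:gcp.cloudsql.database.cpu.utilization{*} by {database_id}

sum:gcp.cloudfunctions.function.execution_count{*} by {function_name}
sum:gcp.cost.amortized{*} by {function_name}
```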
The following out-of-the-box tags are available:
| Google Product | Tag(s) |
|---|---|
| Compute Engine | `instance_id`, `instance-type` |
| Cloud Functions | `function_name` |
| Cloud Run | `job_name`, `service_name` |
| Cloud SQL | `database_id` |
| Cloud Spanner | `instance_id` |
| App Engine | `module_id` |
| BigQuery | `project_id`, `dataset_id` |
| Kubernetes Engine | `cluster_name` |
Container allocation metrics contain all of the same costs as the Google Cloud Platform metrics, but with additional breakdowns and insights for container workloads. See Container Cost Allocation for more details.