SaaS Cost Integrations are in Preview.
SaaS Cost Integrations allow you to send cost data directly from your providers by configuring the accounts associated with your cloud cost data in Datadog.
If your provider is not supported, use Custom Costs to upload any cost data source to Datadog and understand the total cost of your services.
To use SaaS Cost Integrations, you must configure Cloud Cost Management for AWS, Azure, or Google Cloud.
See the respective Cloud Cost Management documentation for your cloud provider.
Navigate to Infrastructure > Cloud Costs > Settings > Accounts and click Configure on a provider to collect cost data.
Databricks
Follow the configuration instructions under Cloud Cost Management, and enter the System Tables SQL Warehouse ID corresponding to your Databricks instance's warehouse to query for system table billing data. Your Databricks cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
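To sanity-check that the warehouse behind the System Tables SQL Warehouse ID can read billing data, you can run a query like the following in Databricks. This is a minimal sketch, assuming system tables are enabled on your workspace; the columns come from the system.billing.usage table.
-- List recent billing usage records from the Databricks system tables.
select usage_date, sku_name, usage_quantity
from system.billing.usage
order by usage_date desc
limit 10;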
Confluent Cloud
Follow the instructions to collect cost data to view in Cloud Cost Management. To collect cluster-level tags or business metadata tags for your costs, you can add a Schema Registry API key and secret; see Schema Management on Confluent Cloud for more information. Your Confluent Cloud cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
MongoDB
Create an API key with Organizational Billing Viewer permissions, and add Organizational Read Only permissions for cluster resource tags. Your MongoDB cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
Snowflake
Navigate to the Snowflake integration tile in Datadog and click Add Snowflake Account.
Enter your Snowflake account URL, for example: https://xyz12345.us-east-1.snowflakecomputing.com.
Under the Connect your Snowflake account section, click the toggle to enable Snowflake in Cloud Cost Management.
Enter your Snowflake user name in the User Name field.
Create a Datadog-specific role and user to monitor Snowflake.
Run the following in Snowflake to create a custom role:
-- Create a new role intended to monitor Snowflake usage.
create role DATADOG;
-- Grant privileges on the SNOWFLAKE database to the new role.
grant imported privileges on database SNOWFLAKE to role DATADOG;
-- Grant usage on your default warehouse to the role DATADOG.
grant usage on warehouse <WAREHOUSE> to role DATADOG;
-- If you have cost usage collection enabled, ensure that your credentials have permission to view the ORGANIZATION_USAGE schema.
grant role orgadmin to role DATADOG;
-- Create a user.
create user DATADOG_USER
LOGIN_NAME = DATADOG_USER
password = '<PASSWORD>'
default_warehouse = <WAREHOUSE>
default_role = DATADOG;
-- Grant the monitor role to the user.
grant role DATADOG to user DATADOG_USER;
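As an optional check that the new role can read organization-level billing data, a query along these lines can be run. This is a sketch; USAGE_IN_CURRENCY_DAILY is one of the views in the ORGANIZATION_USAGE schema referenced above.
-- Optional: verify the DATADOG role can query organization usage views.
use role DATADOG;
use warehouse <WAREHOUSE>;
select usage_date, account_name, usage_in_currency
from SNOWFLAKE.ORGANIZATION_USAGE.USAGE_IN_CURRENCY_DAILY
order by usage_date desc
limit 10;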
Configure key-pair authentication:
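For example, after generating an RSA key pair locally (for example, with openssl), the public key can be attached to the monitoring user. This is a minimal sketch, assuming the DATADOG_USER created above:
-- Attach the public key (the PEM body, without the BEGIN/END header lines)
-- to the user; Datadog then authenticates with the matching private key.
alter user DATADOG_USER set rsa_public_key = '<PUBLIC_KEY>';
-- Confirm the key is set on the user.
describe user DATADOG_USER;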
Click Save.
Your Snowflake cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
Elastic Cloud
Your Elastic Cloud cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
OpenAI
Enable OpenAI Billing Usage Data Collection. Your OpenAI cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
"global:read"
scope and "Billing"
role on the Personal API tokens page in Fastly.Your Fastly cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
Twilio
Enable Twilio in Cloud Cost Management and enter the Account SID for your Twilio account. Your Twilio cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
You can view cost data on the Cloud Costs Analytics page, the Cloud Costs Tag Explorer, and in dashboards, notebooks, or monitors. You can also combine these cost metrics with other cloud cost metrics or observability metrics.
The following tables contain a non-exhaustive list of the out-of-the-box tags associated with each SaaS Cost integration.
Databricks
Tag Name | Tag Description |
---|---|
record_id | Unique ID for this record. |
account_id | ID of the account this report was generated for. |
workspace_id | ID of the Workspace this usage was associated with. |
cloud | Cloud this usage is relevant for. Possible values are AWS, AZURE, and GCP. |
billing_origin_product | Product or feature originating the billing event (for example, JOBS, CLUSTERS). |
usage_type | Type of usage being billed (for example, COMPUTE_TIME). |
job_run_id | Identifier for the specific job run (if applicable). |
node_type | Type of node used in this billing record (for example, m5d.large). |
destination_region | Region where the workload is directed (if applicable). |
central_clean_room_id | ID of the central clean room associated with the workload (if applicable). |
notebook_path | Path to the notebook in Databricks (if applicable). |
job_name | Name of the job in Databricks (if applicable). |
notebook_id | ID of the notebook used in this billing record (if applicable). |
dlt_update_id | Delta Live Table update ID associated with this usage (if applicable). |
job_id | Unique identifier for the job in Databricks. |
dlt_maintenance_id | Maintenance ID for Delta Live Tables (if applicable). |
run_name | Name of the current job or workflow run (if applicable). |
instance_pool_id | ID of the instance pool used (if applicable). |
cluster_id | ID of the cluster associated with this usage. |
endpoint_id | ID of the endpoint for SQL-based or serving-related usage (if applicable). |
warehouse_id | ID of the SQL warehouse (if applicable). |
source_region | Originating region for this billing record (if applicable). |
dlt_pipeline_id | ID of the Delta Live Tables pipeline (if applicable). |
endpoint_name | Name of the SQL or serving endpoint (if applicable). |
is_photon | Indicates whether Photon processing was used (true or false). |
dlt_tier | Tier of Delta Live Tables service (if applicable). |
jobs_tier | Tier of the job, such as CLASSIC or PREMIUM. |
networking | Type of networking used for this job, if specified. |
serving_type | Type of serving model used, if applicable (for example, Model Serving). |
sql_tier | SQL tier associated with the usage (if applicable). |
is_serverless | Indicates if the usage pertains to a serverless compute resource (true or false). |
custom_tags | Custom tags applied to the usage, usually as key-value pairs for additional metadata or categorization. |
usage_metadata | Metadata related to the usage, which might include details like usage type, service category, or other relevant information. |
Confluent Cloud
Tag Name | Tag Description |
---|---|
resource_id | The unique identifier of the Confluent resource. |
resource_name | The name of the Confluent resource. |
environment_id | The unique identifier for the environment. |
network_access_type | Network access type for the cluster. Possible values are INTERNET, TRANSIT_GATEWAY, PRIVATE_LINK, and PEERED_VPC. |
product | Product name. Possible values include KAFKA, CONNECT, KSQL, AUDIT_LOG, STREAM_GOVERNANCE, CLUSTER_LINK, CUSTOM_CONNECT, FLINK, SUPPORT_CLOUD_BASIC, SUPPORT_CLOUD_DEVELOPER, SUPPORT_CLOUD_BUSINESS, and SUPPORT_CLOUD_PREMIER. |
Snowflake
Tag Name | Tag Description |
---|---|
organization_name | Name of the organization. |
contract_number | Snowflake contract number for the organization. |
account_name | Name of the account where the usage was consumed. |
account_locator | Locator for the account where the usage was consumed. |
region | Name of the region where the account is located. |
service_level | Service level (edition) of the Snowflake account (Standard, Enterprise, or Business Critical). |
user_name | Name of the user or service account associated with the query. |
warehouse_id | Identifier for the warehouse generating the cost. |
warehouse_name | Name of the warehouse associated with this usage. |
warehouse_size | Size of the warehouse (for example, Large, Medium). |
cost_type | Type of cost associated with the usage. Possible values include: - CLOUD_SERVICES: General costs related to Snowflake’s underlying cloud services, excluding warehouse usage. - IDLE_OR_LESS_100MS: Costs from warehouse idle time or queries that completed in under 100 milliseconds. Unattributed to specific queries. Falls under the warehouse_metering service type. - QUERY_ATTRIBUTION: Costs attributed to specific queries, grouped by the parameterized query hash. For these costs, the parameterized query associated with this cost can be found under the charge description. Falls under the warehouse_metering service type. |
query_hash | Unique hash representing a parameterized version of the query for attribution purposes. Only found for query attribution costs. |
query_hash_version | Version of the Snowflake query hash algorithm used to generate query_hash. Only found for query attribution costs. |
database_name | Name of the database in which the query was executed (if applicable). Only found for query attribution costs. |
balance_source | Source of the funds used to pay for the daily usage. The source can be one of the following: - capacity: Usage paid with credits remaining on an organization’s capacity commitment. - rollover: Usage paid with rollover credits. When an organization renews a capacity commitment, unused credits are added to the balance of the new contract as rollover credits. - free usage: Usage covered by the free credits provided to the organization. - overage: Usage that was paid at on-demand pricing, which occurs when an organization has exhausted its capacity, rollover, and free credits. - rebate: Usage covered by the credits awarded to the organization when it shared data with another organization. |
service_type | Type of usage. Possible service types include: - automatic_clustering: Refer to Automatic Clustering. - data_transfer: Refer to Understanding data transfer cost. - logging: Refer to Logging and Tracing Overview. - materialized_view: Refer to Working with Materialized Views. - replication: Refer to Introduction to replication and failover across multiple accounts. - query_acceleration: Refer to Using the Query Acceleration Service. - search_optimization: Refer to Search Optimization Service. - serverless_task: Refer to Introduction to tasks. - snowpipe: Refer to Snowpipe. - snowpipe_streaming: Refer to Snowpipe Streaming. - storage: Refer to Understanding storage cost. - warehouse_metering: Refer to Virtual warehouse credit usage. Does not indicate usage of serverless or cloud services compute. |
rating_type | Indicates how the usage in the record is rated, or priced. Possible values include: - compute - data_transfer - storage - Other |
billing_type | Indicates what is being charged or credited. Possible billing types include: - consumption: Usage associated with compute credits, storage costs, and data transfer costs. - rebate: Usage covered by the credits awarded to the organization when it shared data with another organization. - priority support: Charges for priority support services. This charge is associated with a stipulation in a contract, not with an account. - vps_deployment_fee: Charges for a Virtual Private Snowflake deployment. - support_credit: Snowflake Support credited the account to reverse charges attributed to an issue in Snowflake. |
Elastic Cloud
Tag Name | Tag Description |
---|---|
name | The unique identifier of the Elastic Cloud resource. |
price_per_hour | The cost of the Elastic Cloud resource per hour. |
kind | The type of resource. |
MongoDB
Tag Name | Tag Description |
---|---|
invoice_id | The unique identifier of the invoice. |
status | State of the payment. |
mongo_org_id | MongoDB organization ID. |
cluster_name | The name of the cluster that incurred the charge. |
group_id | ID of the project with which the line item is associated. |
replica_set_name | Name of the replica set with which the line item is associated. |
resource_tags | Arbitrary tags on clusters set by users, usually as key-value pairs. |
OpenAI
Tag Name | Tag Description |
---|---|
organization_id | The unique identifier of the organization. |
project_id | The unique identifier of the project. |
project_name | The name of the project. |
organization_name | The name of the organization. |
Fastly
Tag Name | Tag Description |
---|---|
credit_coupon_code | Code of any coupon or credit applied to this cost entry (if available). |
product_name | Name of the specific product being billed (for example, “North America Bandwidth”). |
product_group | Group or category of the product, such as “Full Site Delivery”. |
product_line | Line of products to which this item belongs, for example, “Network Services”. |
usage_type | Type of usage being billed (for example, “Bandwidth”). |
region | Region where the service usage occurred (for example, “North America”). |
service_name | Name of the service associated with this cost entry, often matching the product_name. |
usage_type_cd | Code or label representing the type of usage, such as “North America Bandwidth”. |
plan_name | Name of the plan under which this service falls, often matching “product_line”. |
Twilio
Tag Name | Tag Description |
---|---|
account_sid | Alphanumeric string identifying the Twilio account. |
category | The category of usage. For more information, see Usage Categories. |
count_unit | The units in which count is measured, such as calls for calls or messages for SMS. |
usage_unit | The units in which usage is measured, such as minutes for calls or messages for SMS. |