---
title: Connect BigQuery for Warehouse-Native Experiment Analysis
description: >-
  Connect a BigQuery service account to enable warehouse-native experiment
  analysis.
breadcrumbs: >-
  Docs > Experiments > Experiments Guides > Connect BigQuery for
  Warehouse-Native Experiment Analysis
---

# Connect BigQuery for Warehouse-Native Experiment Analysis

{% callout %}
# Important note for users on the following Datadog sites: app.ddog-gov.com

{% alert level="danger" %}
This product is not supported for your selected [Datadog site](https://docs.datadoghq.com/getting_started/site.md).
{% /alert %}

{% /callout %}

## Overview{% #overview %}

Warehouse-native experiment analysis lets you run statistical computations directly in your data warehouse.

To set this up for BigQuery, connect a BigQuery service account to Datadog and configure your experiment settings. This guide covers:

- Preparing Google Cloud resources
- Granting permissions to the Datadog service account
- Configuring experiment settings in Datadog

## Prerequisites{% #prerequisites %}

Datadog connects to BigQuery through a Google Cloud service account. If you already have a service account connected to Datadog, skip to Step 1. Otherwise, expand the section below to create one.

{% collapsible-section %}
#### Create a Google Cloud service account

1. Open your [Google Cloud console](https://console.cloud.google.com/).
1. Navigate to **IAM & Admin** > **Service Accounts**.
1. Click **Create service account**.
1. Enter the following:
   1. **Service account name**.
   1. **Service account ID**.
   1. **Service account description**.
1. Click **Create and continue**.
   1. **Note**: The **Permissions** and **Principals with access** settings are optional here. These are configured in Step 2.
1. Click **Done**.

After you create the service account, continue to Step 1 to set up the Google Cloud resources.
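
The same service account can also be created with the gcloud CLI. This is a sketch; the project ID (`my-project`) and account name (`datadog-experiments`) are placeholders for your own values.

```shell
# Create the service account in your project (names are placeholders).
gcloud iam service-accounts create datadog-experiments \
  --project=my-project \
  --display-name="Datadog Experiments" \
  --description="Service account for Datadog warehouse-native experiment analysis"

# The resulting service account email follows this pattern:
#   datadog-experiments@my-project.iam.gserviceaccount.com
```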
{% /collapsible-section %}

{% alert level="info" %}
If you plan to use other Google Cloud observability functionality in Datadog, see [Datadog's Google Cloud Platform integration documentation](https://docs.datadoghq.com/integrations/google-cloud-platform.md#metric-collection) to determine which resources to enable.
{% /alert %}

## Step 1: Prepare the Google Cloud resources{% #step-1-prepare-the-google-cloud-resources %}

Datadog Experiments uses a BigQuery dataset to cache experiment results and a Cloud Storage bucket to stage experiment exposure records.

### Create a BigQuery dataset{% #create-a-bigquery-dataset %}

1. Open your [Google Cloud console](https://console.cloud.google.com/).
1. In the **Search** bar, search for **BigQuery**.
1. In the **Explorer** panel, expand your project (for example, `datadog-sandbox`).
1. Select **Datasets**, then click **Create dataset**.
   {% image
      source="https://docs.dd-static.net/images/product_analytics/experiment/exp_bq_gc_create_dataset.a8e3bfc798ef8d9dbf75ca5db16c7739.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/product_analytics/experiment/exp_bq_gc_create_dataset.a8e3bfc798ef8d9dbf75ca5db16c7739.png?auto=format&fit=max&w=850&dpr=2 2x"
      alt="The BigQuery Datasets page in the Google Cloud console showing the datadog-sandbox project expanded in the left Explorer menu with Datasets selected, a list of datasets with columns for Dataset ID, Type, Location, Create time, and Label, and the Create dataset button highlighted in the top right." /%}
1. Enter a **Dataset ID** (for example, `datadog_experiments_output`).
1. (Optional) Select a **Data location** from the dropdown, add **Tags**, and set **Advanced options**.
1. Click **Create dataset**.
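
The steps above can also be performed with the `bq` CLI. A minimal sketch, assuming a project named `my-project` and the example dataset ID from this guide:

```shell
# Create the output dataset that Datadog Experiments writes cached results to.
# --location is optional and defaults to US if omitted.
bq mk --dataset \
  --location=US \
  --description="Datadog Experiments output" \
  my-project:datadog_experiments_output
```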

### Create a Cloud Storage bucket{% #create-a-cloud-storage-bucket %}

Create a Cloud Storage bucket that Datadog Experiments can use to stage experiment exposure records. See Google's [Create a bucket](https://docs.cloud.google.com/storage/docs/creating-buckets#console) documentation.
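
As a CLI alternative, the bucket can be created with `gcloud storage`. Bucket names are globally unique, so `my-project-datadog-experiments` below is a placeholder:

```shell
# Create the staging bucket. Choose a location compatible with your
# BigQuery dataset so data can move between them.
gcloud storage buckets create gs://my-project-datadog-experiments \
  --project=my-project \
  --location=US
```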

## Step 2: Grant permissions to the Datadog service account{% #step-2-grant-permissions-to-the-datadog-service-account %}

The Datadog Experiments service account requires specific permissions to run warehouse-native experiment analysis.

### Assign IAM roles at the project level{% #assign-iam-roles-at-the-project-level %}

To assign the IAM roles that let Datadog Experiments read and write data and run jobs in your data warehouse:

1. Open your [Google Cloud console](https://console.cloud.google.com/) and navigate to **IAM & Admin** > **IAM**.
1. Select the **Allow** tab and click **Grant access**.
1. In the **New principals** field, enter the service account email.
1. Using the **Select a role** dropdown, add the following roles:
   1. [BigQuery Job User](https://docs.cloud.google.com/iam/docs/roles-permissions/bigquery#bigquery.jobUser): Allows the service account to run BigQuery jobs.
   1. [BigQuery Data Owner](https://docs.cloud.google.com/iam/docs/roles-permissions/bigquery#bigquery.dataOwner): Grants the service account full access to the Datadog Experiments output dataset.
   1. [Storage Object User](https://docs.cloud.google.com/iam/docs/roles-permissions/storage#storage.objectUser): Allows the service account to read and write objects in the storage bucket that Datadog Experiments uses.
   1. [BigQuery Data Viewer](https://docs.cloud.google.com/iam/docs/roles-permissions/bigquery#bigquery.dataViewer): Allows the service account to read tables used in warehouse-native metrics.
1. Click **Save**.

{% image
   source="https://docs.dd-static.net/images/product_analytics/experiment/exp_bq_gc_iam_role.fd4b6118392be7ee8ec919cd289f4a06.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/product_analytics/experiment/exp_bq_gc_iam_role.fd4b6118392be7ee8ec919cd289f4a06.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The Google Cloud IAM page showing the Grant access panel for a project, with the Grant access button highlighted on the left, a New principals field highlighted in the Add principals section, and a Select a role dropdown highlighted in the Assign roles section." /%}
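
The four role grants above map to `gcloud projects add-iam-policy-binding` calls. A sketch, with the project ID and service account email as placeholders:

```shell
PROJECT_ID="my-project"
SA="serviceAccount:datadog-experiments@my-project.iam.gserviceaccount.com"

# Grant each project-level role the Datadog Experiments service account needs.
for ROLE in roles/bigquery.jobUser \
            roles/bigquery.dataOwner \
            roles/storage.objectUser \
            roles/bigquery.dataViewer; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="$SA" \
    --role="$ROLE"
done
```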

### Grant read access to specific source tables{% #grant-read-access-to-specific-source-tables %}

Repeat the following steps for each dataset you plan to use for experiment metrics:

1. In the [Google Cloud console](https://console.cloud.google.com/) **Search** bar, search for **BigQuery**.
1. In the **Explorer** panel, expand your project (for example, `datadog-sandbox`).
1. Click **Datasets**, then select the dataset containing your source tables.
1. Click the **Share** dropdown and select **Manage permissions**.
   {% image
      source="https://docs.dd-static.net/images/product_analytics/experiment/exp_bq_gc_permissions.df815d0c5c9d3e75f5a762e95a47f089.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/product_analytics/experiment/exp_bq_gc_permissions.df815d0c5c9d3e75f5a762e95a47f089.png?auto=format&fit=max&w=850&dpr=2 2x"
      alt="The BigQuery dataset page with the Share dropdown expanded and Manage permissions highlighted, showing additional options including Copy link, Authorize Views, Authorize Routines, Authorize Datasets, Manage Subscriptions, and Publish as Listing." /%}
1. Click **Add principal**.
1. In the **New principals** field, enter the service account email.
1. Using the **Select a role** dropdown, select the **BigQuery Data Viewer** role.
1. Click **Save**.
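
Dataset-level access can also be granted from the command line by editing the dataset's access entries with `bq`. A sketch, assuming a source dataset `my-project:source_dataset`:

```shell
# Export the dataset metadata, append a READER entry for the service
# account to the "access" array, then write the metadata back.
bq show --format=prettyjson my-project:source_dataset > dataset.json

# Add an entry like the following to the "access" list in dataset.json:
#   { "role": "READER",
#     "userByEmail": "datadog-experiments@my-project.iam.gserviceaccount.com" }

bq update --source dataset.json my-project:source_dataset
```

At the dataset level, the legacy `READER` role corresponds to the BigQuery Data Viewer role assigned in the console.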

## Step 3: Configure experiment settings{% #step-3-configure-experiment-settings %}

{% alert level="info" %}
Datadog supports one warehouse connection per organization. Connecting BigQuery replaces any existing warehouse connection (for example, Snowflake).
{% /alert %}

After you set up your Google Cloud resources and IAM roles, configure the experiment settings in Datadog:

1. Open [Datadog Product Analytics](https://app.datadoghq.com/product-analytics/).
1. In the left navigation, hover over **Settings** and click **Experiments**.
1. Select the **Warehouse Connections** tab.
1. Click **Connect a data warehouse**. If you already have a warehouse connected, click **Edit** instead.
1. Select the **BigQuery** tile.
1. Under **Select BigQuery Account**, enter:
   - **GCP service account**: The service account you are using for Datadog Experiments.
   - **Project**: Your Google Cloud project.
1. Under **Dataset and GCS Bucket**, enter:
   - **Dataset**: The dataset you created in Step 1 (for example, `datadog_experiments_output`).
   - **GCS Bucket**: The Cloud Storage bucket you created in Step 1.
1. Click **Save**.

{% image
   source="https://docs.dd-static.net/images/product_analytics/experiment/guide/bigquery_experiment_setup_dd.a2f5683641daec29ffe3bf8dbd6fe2e3.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/product_analytics/experiment/guide/bigquery_experiment_setup_dd.a2f5683641daec29ffe3bf8dbd6fe2e3.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="The Edit Data Warehouse modal with BigQuery selected, showing two sections: Select BigQuery Account with fields for GCP service account and Project, and Dataset and GCS Bucket with fields for Dataset and GCS Bucket." /%}

After you save your warehouse connection, [create experiment metrics](https://docs.datadoghq.com/experiments/defining_metrics.md) using your BigQuery data.

## Further reading{% #further-reading %}

- [Defining metrics in Datadog Experiments](https://docs.datadoghq.com/experiments/defining_metrics.md)
- [How to bridge speed and quality in experiments through unified data](https://www.datadoghq.com/blog/experimental-data-datadog/)
