---
title: Connect your LLM provider account
description: >-
  How to connect your LLM provider account to support LLM-as-a-judge
  evaluations
breadcrumbs: >-
  Docs > LLM Observability > Evaluations > Custom LLM-as-a-Judge Evaluations >
  Connect your LLM provider account
---

# Connect your LLM provider account

{% callout %}
# Important note for users on the following Datadog site: app.ddog-gov.com

{% alert level="danger" %}
This product is not supported for your selected [Datadog site](https://docs.datadoghq.com/getting_started/site).
{% /alert %}

{% /callout %}

## Connect your LLM provider account{% #connect-your-llm-provider-account %}

Configure the LLM provider you would like to use for bring-your-own-key (BYOK) evaluations. You only have to complete this step once.

{% tab title="OpenAI" %}

{% alert level="danger" %}
If you are subject to HIPAA, you are responsible for ensuring that you connect only to an OpenAI account that is subject to a business associate agreement (BAA) and meets all requirements for HIPAA compliance.
{% /alert %}

Connect your OpenAI account to LLM Observability with your OpenAI API key. LLM Observability uses the `GPT-4o mini` model for evaluations.

1. In Datadog, navigate to [**LLM Observability > Settings > Integrations**](https://app.datadoghq.com/llm/settings/integrations).
1. Select **Connect** on the OpenAI tile.
1. Follow the instructions on the tile.
   - Provide your OpenAI API key. Ensure that this key has **write** permission for **model capabilities**.
1. Enable **Use this API key to evaluate your LLM applications**.
LLM Observability requires that the Chat Completions API be available for the selected model. See [OpenAI's model overview page](https://developers.openai.com/api/docs/models) for details about which models support this endpoint.

{% image
   source="https://datadog-docs.imgix.net/images/llm_observability/configuration/openai-tile.20f9a9ec33deec8cd3b7a031d38c258b.png?auto=format"
   alt="The OpenAI configuration tile in LLM Observability. Lists instructions for configuring OpenAI and providing your OpenAI API key." /%}

LLM Observability does not support [data residency](https://platform.openai.com/docs/guides/your-data#which-models-and-features-are-eligible-for-data-residency) for OpenAI.
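If you want to sanity-check your API key and the `gpt-4o-mini` model before connecting the account, a minimal sketch using only the Python standard library (the endpoint URL and model name are standard OpenAI values; this is an illustration, not something the integration requires you to run):

```python
import json
import urllib.request

# Builds (but does not send) a minimal Chat Completions request, so you can
# verify your key works before connecting the account. Send it with
# urllib.request.urlopen(req) to check the key end to end.
def build_chat_request(api_key: str, model: str = "gpt-4o-mini") -> urllib.request.Request:
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }).encode()
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_chat_request("sk-...your key...")
print(req.full_url)  # https://api.openai.com/v1/chat/completions
```

A `200` response from this request confirms the key can reach the Chat Completions endpoint; a `401` means the key is invalid or lacks the required model-capabilities permission.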
{% /tab %}

{% tab title="Azure OpenAI" %}

{% alert level="danger" %}
If you are subject to HIPAA, you are responsible for ensuring that you connect only to an Azure OpenAI account that is subject to a business associate agreement (BAA) and meets all requirements for HIPAA compliance.
{% /alert %}

Connect your Azure OpenAI account to LLM Observability with your Azure OpenAI API key. Datadog strongly recommends using the `GPT-4o mini` model for evaluations. The selected model version must support [structured output](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/structured-outputs), and the Chat Completions API must be available. See a [full list of compatible models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/models-sold-directly-by-azure?tabs=global-standard-aoai%2Cglobal-standard&pivots=azure-openai).

1. In Datadog, navigate to [**LLM Observability > Settings > Integrations**](https://app.datadoghq.com/llm/settings/integrations).
1. Select **Connect** on the Azure OpenAI tile.
1. Follow the instructions on the tile.
   - Provide your Azure OpenAI API key. Ensure that this key has **write** permission for **model capabilities**.
   - Provide the Resource Name, Deployment ID, and API version to complete the integration.

{% image
   source="https://datadog-docs.imgix.net/images/llm_observability/configuration/azure-openai-tile.e9eda3d31c231e453daef891671ec93c.png?auto=format"
   alt="The Azure OpenAI configuration tile in LLM Observability. Lists instructions for configuring Azure OpenAI and providing your API Key, Resource Name, Deployment ID, and API Version." /%}
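The Resource Name, Deployment ID, and API version combine into the Chat Completions URL that Azure OpenAI serves. A sketch showing how the three values fit together (all values below are placeholders, following Azure's standard URL pattern):

```python
import json
import urllib.request

# Builds (but does not send) a Chat Completions request from the three values
# the Azure OpenAI tile asks for. Azure uses an "api-key" header rather than
# OpenAI's "Authorization: Bearer" header.
def build_azure_chat_request(api_key, resource_name, deployment_id, api_version):
    url = (
        f"https://{resource_name}.openai.azure.com/openai/deployments/"
        f"{deployment_id}/chat/completions?api-version={api_version}"
    )
    payload = json.dumps({
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }).encode()
    return urllib.request.Request(
        url,
        data=payload,
        headers={"api-key": api_key, "Content-Type": "application/json"},
    )

req = build_azure_chat_request("my-key", "my-resource", "gpt-4o-mini", "2024-06-01")
print(req.full_url)
```

Sending this request with `urllib.request.urlopen(req)` before connecting the tile confirms that the three values resolve to a reachable deployment.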

{% /tab %}

{% tab title="Anthropic" %}

{% alert level="danger" %}
If you are subject to HIPAA, you are responsible for ensuring that you connect only to an Anthropic account that is subject to a business associate agreement (BAA) and meets all requirements for HIPAA compliance.
{% /alert %}

Connect your Anthropic account to LLM Observability with your Anthropic API key. LLM Observability uses the `Haiku` model for evaluations.

1. In Datadog, navigate to [**LLM Observability > Settings > Integrations**](https://app.datadoghq.com/llm/settings/integrations).
1. Select **Connect** on the Anthropic tile.
1. Follow the instructions on the tile.
   - Provide your Anthropic API key. Ensure that this key has **write** permission for **model capabilities**.

{% image
   source="https://datadog-docs.imgix.net/images/llm_observability/configuration/anthropic-tile.d9769847394ba0351ed9ae950c4dd041.png?auto=format"
   alt="The Anthropic configuration tile in LLM Observability. Lists instructions for configuring Anthropic and providing your Anthropic API key." /%}
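To verify the key before connecting it, you can build a minimal request against Anthropic's Messages API. The model alias below is only an example; LLM Observability selects the Haiku model itself once the account is connected:

```python
import json
import urllib.request

# Builds (but does not send) a minimal Messages API request. Anthropic uses an
# "x-api-key" header plus a required "anthropic-version" header.
def build_anthropic_request(api_key, model="claude-3-5-haiku-latest"):
    payload = json.dumps({
        "model": model,
        "max_tokens": 1,
        "messages": [{"role": "user", "content": "ping"}],
    }).encode()
    return urllib.request.Request(
        "https://api.anthropic.com/v1/messages",
        data=payload,
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "Content-Type": "application/json",
        },
    )

req = build_anthropic_request("sk-ant-...your key...")
print(req.full_url)  # https://api.anthropic.com/v1/messages
```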

{% /tab %}

{% tab title="Amazon Bedrock" %}

{% alert level="danger" %}
If you are subject to HIPAA, you are responsible for ensuring that you connect only to an Amazon Bedrock account that is subject to a business associate agreement (BAA) and meets all requirements for HIPAA compliance.
{% /alert %}

Connect your Amazon Bedrock account to LLM Observability with your AWS account. LLM Observability uses the `Haiku` model for evaluations.

1. In Datadog, navigate to [**LLM Observability > Settings > Integrations**](https://app.datadoghq.com/llm/settings/integrations).

1. Select **Connect** on the Amazon Bedrock tile.

1. Follow the instructions on the tile.

   {% image
      source="https://datadog-docs.imgix.net/images/llm_observability/configuration/amazon-bedrock-tile.afc83f52f13c6311f781caccb2201aaa.png?auto=format"
      alt="The Amazon Bedrock configuration tile in LLM Observability. Lists instructions for configuring Amazon Bedrock." /%}

1. Be sure to configure the **Invoke models from Amazon Bedrock** role to run evaluations. More details about the InvokeModel action can be found in the [Amazon Bedrock API reference documentation](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html).

   {% image
      source="https://datadog-docs.imgix.net/images/llm_observability/configuration/amazon-bedrock-tile-step-2.97f602085810fadf2d237d5d62a2f1a3.png?auto=format"
      alt="The second step in configuring Amazon Bedrock requiring users to add permissions to the integration account." /%}
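The role that Datadog assumes needs permission to call `InvokeModel`. A minimal IAM policy statement granting it might look like the following; the wildcard resource is a placeholder, and you may want to scope it to the specific foundation models you use:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "arn:aws:bedrock:*::foundation-model/*"
    }
  ]
}
```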

{% /tab %}

{% tab title="GCP Vertex AI" %}

{% alert level="danger" %}
If you are subject to HIPAA, you are responsible for ensuring that you connect only to a Google Cloud Platform account that is subject to a business associate agreement (BAA) and meets all requirements for HIPAA compliance.
{% /alert %}

Connect Vertex AI to LLM Observability with your Google Cloud Platform account. LLM Observability uses the `gemini-2.5-flash` model for evaluations.

1. In Datadog, navigate to [**LLM Observability > Settings > Integrations**](https://app.datadoghq.com/llm/settings/integrations).
1. On the Google Cloud Vertex AI tile, click **Connect** to add a new GCP account, or click **Configure** next to an existing account to begin the onboarding process.
   - All GCP accounts connected to Datadog are listed on this page. However, you must still complete the onboarding process for an account before you can use it in LLM Observability.
1. Follow the onboarding instructions to configure your account.
   - Add the [**Vertex AI User**](https://docs.cloud.google.com/vertex-ai/docs/general/access-control#aiplatform.user) role to your account and enable the [**Vertex AI API**](https://console.cloud.google.com/apis/library/aiplatform.googleapis.com).

{% image
   source="https://datadog-docs.imgix.net/images/llm_observability/configuration/vertex-ai-pint.9174c851c6ea22afe8475df5339ccfa0.png?auto=format"
   alt="The Vertex AI onboarding workflow. Follow steps to configure your GCP service account with the right Vertex AI permissions for use with LLM Observability." /%}
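The two onboarding requirements above can also be applied from the command line. This is a sketch; `PROJECT_ID` and `SA_EMAIL` are placeholders for your project and the service account used by Datadog:

```shell
# Enable the Vertex AI API for the project
gcloud services enable aiplatform.googleapis.com --project=PROJECT_ID

# Grant the Vertex AI User role to the service account
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_EMAIL" \
  --role="roles/aiplatform.user"
```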

{% /tab %}

{% tab title="AI Gateway" %}

{% alert level="danger" %}
If you are subject to HIPAA, you are responsible for ensuring that you only connect to an AI Gateway that is subject to a business associate agreement (BAA) and meets all requirements for HIPAA compliance.
{% /alert %}

Your AI Gateway must be compatible with the [OpenAI API specification](https://platform.openai.com/docs/api-reference/introduction).

Connect your AI Gateway to LLM Observability with your base URL, API key, and headers.

1. In Datadog, navigate to [**LLM Observability > Settings > Integrations**](https://app.datadoghq.com/llm/settings/integrations).
1. Click the **Configure** tab, then click **New** to create a new gateway.
1. Follow the instructions on the tile.
   - Provide a name for your gateway.
   - Select your provider.
   - Provide your base URL.
   - Provide your API key and optionally any headers.

{% image
   source="https://datadog-docs.imgix.net/images/llm_observability/configuration/ai-gateway-tile-3.76d78aaa840fa1bbdd7041410de231b5.png?auto=format"
   alt="The AI Gateway configuration tile in LLM Observability. Lists instructions for configuring an AI gateway." /%}
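Because the gateway must speak the OpenAI API, you can smoke-test the base URL, API key, and headers you are about to enter with the same request shape OpenAI uses. The base URL, model, and extra header below are placeholders:

```python
import json
import urllib.request

# Builds (but does not send) a Chat Completions request against a custom,
# OpenAI-compatible gateway, merging any extra headers the gateway requires.
def build_gateway_request(base_url, api_key, model, extra_headers=None):
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    headers.update(extra_headers or {})
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }).encode()
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=payload,
        headers=headers,
    )

req = build_gateway_request(
    "https://gateway.example.com/v1",
    "my-key",
    "gpt-4o-mini",
    extra_headers={"X-Org-Id": "my-org"},
)
print(req.full_url)  # https://gateway.example.com/v1/chat/completions
```

If sending this request through your gateway returns a valid chat completion, the same base URL, key, and headers should work in the tile.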

{% /tab %}

If your LLM provider restricts IP addresses, obtain the required IP ranges as follows:

1. Visit [Datadog's IP ranges documentation](https://docs.datadoghq.com/api/latest/ip-ranges/).
1. Select your `Datadog Site`.
1. Paste the `GET` URL into your browser.
1. Copy the `webhooks` section.
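The IP ranges endpoint returns JSON, so the `webhooks` prefixes can also be pulled out programmatically. A small sketch; the sample data uses a documentation-only IP range, and the commented URL is the public endpoint for the US1 site (other sites use their own hostname, shown in the IP ranges documentation):

```python
import json
import urllib.request

# Extracts the webhook IP prefixes from a Datadog ip-ranges JSON document.
def webhook_prefixes(ranges: dict) -> list[str]:
    webhooks = ranges.get("webhooks", {})
    return webhooks.get("prefixes_ipv4", []) + webhooks.get("prefixes_ipv6", [])

# Example ip-ranges payload, trimmed to the relevant section.
sample = {"webhooks": {"prefixes_ipv4": ["198.51.100.0/24"], "prefixes_ipv6": []}}
print(webhook_prefixes(sample))  # ['198.51.100.0/24']

# To fetch the live document for the US1 site:
# with urllib.request.urlopen("https://ip-ranges.datadoghq.com") as resp:
#     print(webhook_prefixes(json.load(resp)))
```

Allowlist every prefix in the result, since evaluation traffic can originate from any of them.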
