Overview
If you experience issues setting up or running Datadog Experiments, use this page to troubleshoot. If you continue to have trouble, contact Datadog support.
Experiment results do not appear
If experiment results are missing after you launch an experiment, start by checking whether the experiment is assigning users. Then, navigate to the appropriate troubleshooting step.
Step 1: Confirm the experiment is assigning users
On the Experiments page, select your experiment to open its detail page. Hover over the values of the metric scorecard:
- If the subject assignment count for each variant is zero, go to Step 2 to debug traffic.
- If the subject assignment count is greater than zero, but the metric values are zero, skip to Step 3.
In the following example, the subject assignment count is 12,427 for Variant A and 12,573 for Variant B.
If your metric scorecard shows N/A for metric values or subject assignment counts, this means the analysis has not run yet. Wait for it to run, then continue with the troubleshooting steps as needed.
Step 2: Confirm the experiment is receiving traffic
Verify that your feature flag is enabled and evaluates in the correct environment. Then, confirm that traffic reaches the experiment’s targeting rule.
On the Experiments page, select your experiment to open its detail page. Hover over the experiment flag label (for example, new-product-photos).
Note the Environment where the experiment is running, then click Go to Flag.
On the Feature Flags page, select the correct environment tab and confirm that the flag is Enabled. If the flag is disabled, enable it before proceeding.
In the Real-time metric overview section, confirm that the bar chart shows exposure events.
Based on what you see in the Real-time metric overview, follow the appropriate path:
The flag is not receiving traffic
If the Real-time metric overview section shows no exposure events, the flag is not receiving traffic from your application.
Confirm the flag is enabled in the environment where your application runs. You can manage environments on the Environments page. See the Getting Started with Feature Flags guide for details on environments.
After you enable the flag, check the Real-time metric overview for incoming exposure events. Then, return to Step 1 to verify that the subject assignment count is increasing.
The flag is receiving traffic but experiment assignments are zero
If the flag shows exposures but the metric scorecard shows zero assignments, traffic is not reaching the experiment’s targeting rule.
The Targeting Rules & Rollouts section displays a list of targeting rules that the flag evaluates from top to bottom. Rules above the experiment’s targeting rule, such as rules that exclude internal users or specific organizations, can capture traffic before it reaches the experiment.
Check the following and edit the targeting rule and traffic exposure as needed:
- Targeting rule order: Are targeting rules above the experiment capturing traffic before it reaches the experiment rule?
- Targeting rule filters: Does incoming traffic match the filters in the experiment’s targeting rule?
- Traffic exposure: Is the traffic exposure for the targeting rule set correctly?
After making the necessary changes, return to Step 1 to verify that the subject assignment count is increasing.
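The rule-ordering pitfall can be illustrated with a minimal first-match sketch (hypothetical rules and users, invented for illustration; this is not Datadog's actual evaluation engine): rules are checked top to bottom, and a broad rule above the experiment rule absorbs traffic before the experiment rule ever sees it.

```python
# Minimal first-match rule evaluation sketch (hypothetical data;
# not Datadog's actual flag evaluation engine).

def first_matching_rule(rules, user):
    """Rules are evaluated top to bottom; the first match captures the user."""
    for rule in rules:
        if rule["matches"](user):
            return rule["name"]
    return None

rules = [
    # A broad rule placed above the experiment rule...
    {"name": "exclude-internal",
     "matches": lambda u: u["email"].endswith("@example.com")},
    # ...captures users before this experiment rule can assign them.
    {"name": "experiment", "matches": lambda u: True},
]

internal = {"email": "dev@example.com"}
external = {"email": "customer@shop.io"}

print(first_matching_rule(rules, internal))  # exclude-internal
print(first_matching_rule(rules, external))  # experiment
```

Here the internal user never reaches the experiment rule, so they are excluded from assignment even though the flag itself is receiving their traffic.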
Step 3: Confirm metric events are firing
If users are assigned to the experiment but the metric values are missing, check whether the assigned users have associated metric events.
Work through the following checks in order. Each builds on the previous one, so continue to the next if the issue persists.
A metric event must meet two criteria for Datadog to include it in experiment results:
- The event must come from a user with at least one experiment exposure event.
- The event must occur after the user's first experiment exposure.
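The two criteria amount to a join-and-filter over your events. The following rough sketch shows the logic under assumed event shapes (plain dicts with `user` and `timestamp` fields; this is not Datadog's internal pipeline):

```python
# Sketch of the two inclusion criteria (assumed event shapes;
# not Datadog's internal processing pipeline).

def included_metric_events(metric_events, exposure_events):
    """Keep metric events from exposed users that occur after their first exposure."""
    first_exposure = {}
    for e in exposure_events:
        user, ts = e["user"], e["timestamp"]
        if user not in first_exposure or ts < first_exposure[user]:
            first_exposure[user] = ts
    return [
        m for m in metric_events
        if m["user"] in first_exposure                    # criterion 1: user was exposed
        and m["timestamp"] > first_exposure[m["user"]]    # criterion 2: after first exposure
    ]

exposures = [{"user": "u1", "timestamp": 100}]
metrics = [
    {"user": "u1", "timestamp": 150},  # counted
    {"user": "u1", "timestamp": 50},   # fired before first exposure: excluded
    {"user": "u2", "timestamp": 200},  # user never exposed: excluded
]

print(included_metric_events(metrics, exposures))
# [{'user': 'u1', 'timestamp': 150}]
```

This is why an event stream that looks healthy in isolation can still produce zero metric values: events from unexposed users, or events that fire before the first exposure, are dropped.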
From the metric scorecard you checked in Step 1, hover over the metric with a zero value and click the ⋮ menu icon. Select Edit Metric to open the metric definition page.
Verify that the metric event name is correct (for example, check for typos). Then, review the event volume chart on the right side of the page to confirm the event is firing.
If the event is firing but metric values are still zero, the metric events may not be matching experiment exposures. Continue to the next section to verify that the exposure and metric identifiers match.
Datadog matches metric events to experiment exposures using a pair of identifiers: the targetingKey your SDK passes and the subject type attribute configured in Datadog.
If the targetingKey in your SDK does not match the subject type attribute configured in Datadog, the experiment cannot associate metrics with users.
On the Experiments page, select your experiment to open its detail page.
Select the Flag & Exposures tab. Then, click View Exposures Log to see a list of recently exposed subjects. For details on how exposure events are tracked, see the SDK documentation.
The Subject column shows the value your SDK passes as targetingKey. Confirm that the targetingKey in your SDK matches the subject type attribute (for example, @usr.id). If these identifiers do not match, update them before proceeding.
To resolve a mismatch, update either the targetingKey in your SDK or the attribute on the Subject Types page so that both use the same identifier.
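A quick way to spot such a mismatch offline is to compare the targetingKey values from your Exposures Log against the subject attribute values on your metric events. A hypothetical sketch (sample identifiers invented for illustration):

```python
# Hypothetical mismatch check: compare targetingKey values seen in the
# Exposures Log with subject attribute values on metric events.

def unmatched_subjects(exposure_keys, metric_subject_ids):
    """Return exposure subjects that never appear as a metric subject id."""
    return sorted(set(exposure_keys) - set(metric_subject_ids))

# targetingKey values from the Exposures Log (sample data)
exposure_keys = ["user-123", "user-456"]
# @usr.id values on metric events (sample data) -- note the different format
metric_subject_ids = ["123", "456"]

print(unmatched_subjects(exposure_keys, metric_subject_ids))
# ['user-123', 'user-456']  -> every subject is unmatched: the formats differ
```

When every exposure subject shows up as unmatched, as here, the two systems are using differently formatted identifiers and no metric events can be attributed to the experiment.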
If identifiers match and users are assigned to the experiment but experiment results are still missing, inspect individual sessions to identify why specific users are not generating metric events.
On the Activity Stream page, filter for experiment sessions using the following syntax:
@feature_flags.<flag-key>:<variant-value>
For example, to filter sessions for the false variant of the new-product-photos flag, use @feature_flags.new-product-photos:false.
Select a session from a user assigned to the experiment. In the session timeline, check for the following:
- Is the metric event present? Verify that the expected metric event is firing within the session.
- Does the metric event occur after the feature flag evaluation? Events that occur before the feature flag evaluates do not count toward experiment results.
If the metric event is missing or fires before the feature flag evaluation, contact Datadog support with the session URL, the experiment name, and the metric event name.
If you have confirmed that metric events are firing and identifiers match, but metric values are still zero, outlier handling may be the cause.
When outlier handling is enabled, Datadog calculates a threshold based on the distribution of metric values across users. If the number of users with a metric event is small, Datadog may compute the threshold as zero, which truncates all metric values to zero.
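The truncation effect can be reproduced with a toy nearest-rank percentile calculation (illustrative only; the exact threshold formula Datadog uses is not shown here, and the numbers are invented):

```python
# Toy illustration of percentile-based outlier truncation (illustrative
# only; not the exact threshold formula Datadog uses).

def truncate_at_percentile(values, pct):
    """Clamp each value to the pct-th nearest-rank percentile of the distribution."""
    ordered = sorted(values)
    idx = max(0, int(len(ordered) * pct / 100) - 1)
    threshold = ordered[idx]
    return [min(v, threshold) for v in values]

# 100 users, only 2 of whom emitted the metric event.
values = [0] * 98 + [5, 7]

print(truncate_at_percentile(values, 95))  # every value clamped to 0
```

Because 98 of the 100 users have a value of zero, the 95th-percentile threshold is itself zero, and the two real measurements are truncated away, which is exactly the symptom described above.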
To check if outlier handling is the cause:
- On the Experiments page, select your experiment to open its detail page.
- Hover over the metric name, click the ⋮ menu icon, and select Edit Metric to open the metric definition page.
- Expand the Experiment settings accordion. Under Outlier handling, toggle off both Lower bound percentile and Upper bound percentile.
- Save the metric.
- To trigger an immediate recompute, go to the Metrics section of the experiment’s detail page, click the ⋮ menu icon next to Last Updated, and select run an update now. Otherwise, wait for the next scheduled update.
If metric values appear after disabling outlier handling, the threshold was truncating your data. To resolve this, keep outlier handling disabled or set a higher threshold on the Edit Metric page.
If the issue persists after completing all checks, contact the Datadog support team.