The number in Experiments > Overview > Exposures doesn't correspond to the actual number of people who performed Experiment Started for this experiment since the start date.
There are a few documented reasons why the Exposures count in the Experiments Overview might not match what you'd expect from a raw count of $experiment_started events:
Possible Causes
1. How the Analysis Engine Works
The Experiment report uses borrowed properties to attribute user behavior to variants. It identifies events performed after exposure and borrows variant details from the $experiment_started event. The exposure count reflects users analyzed through this mechanism, which may differ from a simple event count. [Experiments analysis]
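As an illustration of the borrowed-properties mechanism described above, here is a minimal sketch. It is a toy model of the documented behavior (events after exposure borrow the variant from $experiment_started), not Mixpanel's actual implementation; the event shapes and field names are simplified assumptions.

```python
def attribute_events(user_events):
    """Toy model of exposure-based attribution: events after a user's
    $experiment_started event 'borrow' its variant; earlier events get
    no variant. user_events: dicts with 'event', 'time', and optionally
    'variant' (simplified stand-ins for real Mixpanel event payloads)."""
    variant = None
    attributed = []
    for ev in sorted(user_events, key=lambda e: e["time"]):
        if ev["event"] == "$experiment_started":
            variant = ev.get("variant")
        attributed.append({**ev, "borrowed_variant": variant})
    return attributed

events = [
    {"event": "Page View", "time": 1},
    {"event": "$experiment_started", "time": 2, "variant": "treatment"},
    {"event": "Purchase", "time": 3},
]
result = attribute_events(events)
# The pre-exposure Page View carries no variant; the post-exposure
# Purchase borrows "treatment" from the exposure event.
```

Because the report counts users through this attribution pass rather than by tallying raw events, the two numbers can legitimately diverge.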
2. Experiment Start Date
The Experiment report sets the Experiment Start Date as the date of the first user exposed. If you're comparing against a different date range, the counts may not align. [Experiments setup]
3. Sample Ratio Mismatch (SRM)
If the distribution of users across variants looks off, this could indicate a Sample Ratio Mismatch — a sign of bugs in assignment logic, tracking implementation, or user bucketing. Mixpanel runs a Chi-squared test to detect this and will show a warning if detected. [Experiments health checks]
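To see what such a check looks like, here is a self-contained sketch of a chi-squared SRM test for the two-variant case. The alpha threshold and counts are made-up assumptions; Mixpanel's actual test parameters are not documented here.

```python
import math

def srm_check(observed, expected_ratio, alpha=0.001):
    """Chi-squared test for Sample Ratio Mismatch between two variants.
    observed: (count_a, count_b); expected_ratio: e.g. (1, 1) for a 50/50
    split. Two-variant case only (1 degree of freedom), so the p-value
    comes from the chi2(1) CDF via the error function.
    Returns (statistic, p_value, srm_detected)."""
    total = sum(observed)
    weights = [w / sum(expected_ratio) for w in expected_ratio]
    expected = [total * w for w in weights]
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p_value = 1.0 - math.erf(math.sqrt(stat / 2.0))
    return stat, p_value, p_value < alpha

# A 50/50 experiment where one variant got 600 extra users: flagged.
_, p1, srm1 = srm_check((5000, 5600), (1, 1))
# A small, unremarkable imbalance: not flagged.
_, p2, srm2 = srm_check((5000, 5050), (1, 1))
```

If this check fires on your own exposure counts, the discrepancy is likely an assignment or tracking bug rather than a reporting quirk.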
4. Exposure Event Tracking Issues
Common implementation problems that can cause discrepancies include:
Exposure events not firing consistently across variants
Users being reassigned to different variants mid-experiment
The exposure event being sent at assignment time rather than at actual exposure time [Experiments health checks]
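The last point (assignment time vs. exposure time) can be sketched as follows. The `track` function here is a local stand-in for a Mixpanel SDK call so the example is self-contained; the property names `Experiment name` and `Variant name` come from the docs, while the experiment name and bucketing logic are made-up assumptions.

```python
tracked = []

def track(distinct_id, event, properties):
    """Local stand-in for an SDK track call; records events in memory."""
    tracked.append({"distinct_id": distinct_id, "event": event, **properties})

def assign_variant(distinct_id):
    """Bucketing at assignment time: decide the variant but do NOT fire
    the exposure event yet -- the user may never reach the experiment."""
    return "treatment" if hash(distinct_id) % 2 else "control"

def render_experiment_surface(distinct_id, variant):
    """Fire $experiment_started only when the user actually sees the
    experiment, so Exposures reflects real exposure, not assignment."""
    track(distinct_id, "$experiment_started", {
        "Experiment name": "checkout_redesign",  # hypothetical name
        "Variant name": variant,
    })

variant = assign_variant("user-123")            # no event fired here
render_experiment_surface("user-123", variant)  # exposure fired here
```

Firing the event at assignment time instead would inflate exposures with users who never saw either variant.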
5. Data Discrepancy Debugging Tips
If you're comparing the Experiments Overview count against an Insights report querying $experiment_started, make sure both are:
Using the same time frame and timezone
Counting the same unit (unique users vs. total event count)
Applying the same filters [Debugging discrepancies]
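The unique-users vs. total-events distinction from the checklist above can be demonstrated on raw data. This sketch assumes the line-delimited JSON shape of exported Mixpanel events (`{"event": ..., "properties": {...}}`); the sample rows are fabricated for illustration.

```python
import json

def count_exposures(raw_lines, event_name="$experiment_started"):
    """Count both total events and unique users for one event name from
    line-delimited JSON export rows."""
    total, users = 0, set()
    for line in raw_lines:
        row = json.loads(line)
        if row["event"] != event_name:
            continue
        total += 1
        users.add(row["properties"]["distinct_id"])
    return total, len(users)

sample = [
    '{"event": "$experiment_started", "properties": {"distinct_id": "a"}}',
    '{"event": "$experiment_started", "properties": {"distinct_id": "a"}}',
    '{"event": "$experiment_started", "properties": {"distinct_id": "b"}}',
    '{"event": "Purchase", "properties": {"distinct_id": "b"}}',
]
total_events, unique_users = count_exposures(sample)
# → 3 total events but only 2 unique users
```

A report counting total events will always read higher than one counting uniques whenever any user fires the exposure event more than once.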
-------------------------
Recommended next steps:
Check the Experiments health checks (SRM and Retro-AA) for any warnings.
Verify your $experiment_started events include the correct Experiment name and Variant name properties.
Confirm you're comparing unique users (not total event count) in both places.
Check if there are any late-arriving events by breaking down your Insights report by mp_processing_time_ms.
If the issue persists, reaching out to Mixpanel Support with specific distinct_id examples would help pinpoint whether users are being tracked but attributed differently.
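For the late-arriving-events check above, the idea is to compare each event's own timestamp against when Mixpanel processed it. This sketch assumes `time` is epoch seconds and `mp_processing_time_ms` is epoch milliseconds; the threshold and sample data are made-up assumptions.

```python
def find_late_events(events, threshold_hours=24):
    """Flag events whose ingestion lag (processing time minus event time)
    exceeds a threshold -- such events can appear in a query run later
    but be missing from a report viewed earlier.
    events: dicts with 'time' (epoch s) and 'mp_processing_time_ms'."""
    late = []
    for ev in events:
        lag_hours = (ev["mp_processing_time_ms"] / 1000 - ev["time"]) / 3600
        if lag_hours > threshold_hours:
            late.append({**ev, "lag_hours": round(lag_hours, 1)})
    return late

events = [
    {"distinct_id": "a", "time": 1_700_000_000,
     "mp_processing_time_ms": 1_700_000_005_000},   # ~5 s lag: fine
    {"distinct_id": "b", "time": 1_700_000_000,
     "mp_processing_time_ms": 1_700_172_800_000},   # ~48 h lag: late
]
late = find_late_events(events)
# → only user "b" is flagged
```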
ℹ️ Tip: Mention @kapa.ai in the thread for follow-ups or DM the app for 1-to-1 help.
-------------------------
Given that you're already counting unique users with the same date ranges and the same experiment/variant names, the knowledge sources don't explicitly explain the exact mechanism behind the Exposures count in the Experiments Overview versus a raw unique user count of $experiment_started in another report. However, based on what the docs do say, here are the most likely remaining causes:
1. Exposure Event Firing Consistency
The Experiment report may only count users whose $experiment_started event fired consistently and correctly with the required properties (Experiment name and Variant name). If some events were tracked with incorrect casing or slight naming differences (Mixpanel is case-sensitive), those users could be excluded from the Experiments Overview but still appear in a broader event query. [Experiments analysis]
2. Custom Experiment Event Settings
If your project has configured a custom exposure event (instead of the default $experiment_started) in Project Settings under Experiment Event Settings, the Experiments Overview would use that custom event — meaning a query on $experiment_started directly might not match. [Experiments analysis]
3. Users Reassigned Mid-Experiment
If users were reassigned to different variants mid-experiment, the analysis engine splits their behavior fractionally across variants, which could affect how exposures are counted. [Experiments analysis]
4. Late-Arriving Data
If some $experiment_started events arrived late (imported or delayed), they may appear in a raw event query but not yet be reflected in the Experiments report, depending on when you viewed each. [Delayed ingestion]
-------------------------
Beyond these points, the knowledge sources don't contain enough detail about the internal Exposures calculation to explain further discrepancies.
I'd recommend contacting Mixpanel Support with specific distinct_id examples of users who appear in one count but not the other — that's the most reliable way to pinpoint the root cause.
The knowledge sources don't contain enough specific information to explain why the Experiments Overview Exposures count would be larger than a direct unique user count of $experiment_started events.
The documented causes I mentioned previously (custom event settings, late-arriving data, etc.) would more typically explain a lower count in the Overview, not a higher one.
One possibility the docs do support: if a user is reassigned to a different variant mid-experiment, the analysis engine splits their behavior across variants — this could theoretically result in a user being counted in multiple variant exposure tallies, inflating the total. [Experiments analysis]
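A toy sketch of why reassignment could inflate the total: a user assigned to two variants over the experiment's lifetime appears in each variant's tally, so summing per-variant uniques exceeds the true unique-user count. This is purely illustrative; the docs describe fractional splitting but not the exact tally logic, so the counting scheme here is an assumption.

```python
def exposure_tallies(assignments):
    """Toy model: map each user to every variant they were ever assigned.
    A reassigned user lands in multiple variants' tallies, so the summed
    per-variant uniques can exceed the true unique-user count.
    assignments: {distinct_id: [variant, ...]}."""
    per_variant = {}
    for user, variants in assignments.items():
        for v in set(variants):
            per_variant.setdefault(v, set()).add(user)
    summed = sum(len(users) for users in per_variant.values())
    return per_variant, summed

assignments = {
    "a": ["control"],
    "b": ["control", "treatment"],  # reassigned mid-experiment
    "c": ["treatment"],
}
per_variant, summed = exposure_tallies(assignments)
# 3 unique users, but the per-variant tallies sum to 4
```

If the Overview sums variant-level exposures while your Insights query counts globally unique users, this alone could make the Overview read higher.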
Beyond that, I'd strongly recommend contacting Mixpanel Support directly with:
Specific distinct_id examples of users who appear in the Exposures count but not in your $experiment_started query (or vice versa)
Screenshots of both reports with identical date ranges and filters
This is the most reliable way to identify the root cause, as the knowledge sources don't provide enough detail about the internal Exposures calculation to explain a higher count with confidence.
The knowledge sources don't contain specific information about whether the Experiments Overview Exposures section has a drill-down or "View Users" capability.
What the docs do mention is that you can click 'Analyze' on a metric in the Experiments report to open a standard Insights report with the experiment breakdown applied — and from there you can view users, view replays, or apply additional breakdowns. [Experiments in reports]
As a workaround to investigate the discrepancy, you could:
Use the 'Analyze' button on a metric in the Experiments report to open it in Insights with the experiment filters pre-applied.
Manually build an Insights report querying $experiment_started, filtered by your Experiment name and Variant name, counting unique users — and compare that directly to the Exposures number.
However, for a definitive explanation of why the Exposures count is higher than your manual query, I'd still recommend reaching out to Mixpanel Support directly, as the knowledge sources don't contain enough detail about the internal Exposures calculation to explain this with confidence.
