How do I run an A/B test with Mixpanel?
Here’s the shortest path to run and measure A/B tests with Mixpanel:

1) Implement exposure tracking
- Send an exposure event when a user is actually shown a variant (not just assigned to one), once per experiment exposure:

```javascript
mixpanel.track('$experiment_started', {
  'Experiment name': 'Your Experiment',
  'Variant name': 'Control or Variant A/B'
});
```

Both the experiment name and variant name must be string properties. You can remap to a different exposure event and property set under Project Settings > Experiment Event Settings, which is useful if you’re using a third-party flagging tool (Adding Experiments to an Implementation).
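Below is a minimal sketch of firing that exposure event exactly once per experiment per browser session, assuming mixpanel-browser is already initialized; the `trackExposure` helper, the sessionStorage key, and the once-per-session policy are illustrative choices, not part of Mixpanel's API.

```javascript
// Minimal sketch (assumes mixpanel-browser is loaded and initialized).
// Fires the exposure event once per experiment per browser session,
// at the moment the variant is actually rendered.
const EXPOSED_KEY = 'mp_exposed_experiments'; // hypothetical storage key

function trackExposure(experimentName, variantName) {
  const exposed = JSON.parse(sessionStorage.getItem(EXPOSED_KEY) || '{}');
  if (exposed[experimentName]) return; // already reported this session
  mixpanel.track('$experiment_started', {
    'Experiment name': experimentName, // must be a string
    'Variant name': variantName        // must be a string
  });
  exposed[experimentName] = true;
  sessionStorage.setItem(EXPOSED_KEY, JSON.stringify(exposed));
}

// Call when the user sees the variant, not at assignment time:
trackExposure('new-checkout-flow', 'variant-a');
```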
2) Use the Experiments report (Enterprise)
- Go to Experiments and select your experiment (experiments with exposure events in the last 30 days are auto-detected; older experiment names can be entered manually), choose the control variant, select success and guardrail metrics, set a sample size or minimum number of days, and review the defaults (Sequential model, 95% confidence) (Experiments overview).
- Under the hood, Mixpanel attributes events to variants using properties borrowed from $experiment_started, handles users who switch variants via fractional attribution, and treats users as exposed for up to 90 days after exposure (How the engine works); a toy illustration follows below.
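To make that concrete, here is a toy model of exposure-window attribution; this is not Mixpanel’s actual server-side engine, and the even split across variants is an assumption for illustration. Only the 90-day window and the idea of crediting conversions to prior exposures come from the doc above.

```javascript
// Toy model only: Mixpanel computes attribution server-side.
const EXPOSURE_WINDOW_MS = 90 * 24 * 60 * 60 * 1000; // 90-day exposure window

function attributeConversion(exposures, conversionTime) {
  // exposures: [{ variant: 'control', time: <ms epoch> }, ...]
  const inWindow = exposures.filter(
    e => e.time <= conversionTime && conversionTime - e.time <= EXPOSURE_WINDOW_MS
  );
  if (inWindow.length === 0) return null; // never exposed: excluded from the test
  // Fractional attribution (assumed even split): one conversion shared
  // across the distinct variants the user saw.
  const variants = [...new Set(inWindow.map(e => e.variant))];
  return Object.fromEntries(variants.map(v => [v, 1 / variants.length]));
}

// A user who saw both variants contributes 0.5 conversions to each:
attributeConversion(
  [{ variant: 'control', time: 0 }, { variant: 'variant-a', time: 1000 }],
  2000
); // -> { control: 0.5, 'variant-a': 0.5 }
```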
3) If using AB Tasty (no-code setup)
- Parse the “AB Tasty” property on AB Tasty’s “Event” into two Mixpanel Custom Properties, then map them in Experiment Event Settings so your tests appear natively in Mixpanel’s dropdowns and reports (AB Tasty Events in Mixpanel).
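If you prefer code over the no-code Custom Properties route, a client-side sketch could look like this; the “Campaign - Variation” string format and the `onAbTastyEvent` hook are assumptions, so check the actual shape of your AB Tasty payloads against the integration doc.

```javascript
// Hypothetical: split a combined "Campaign - Variation" string into the
// two values Mixpanel expects, then re-emit a native exposure event.
function onAbTastyEvent(abTastyValue) {
  const [campaign, variation] = abTastyValue.split(' - ').map(s => s.trim());
  if (!campaign || !variation) return; // unexpected format: skip
  mixpanel.track('$experiment_started', {
    'Experiment name': campaign,
    'Variant name': variation
  });
}

onAbTastyEvent('Homepage Hero Test - Variation 1');
```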
4) Best practices
- Define a primary metric and guardrails; estimate sample size and minimum detectable effect (MDE) up front (see the sketch after this list); use feature flags to control exposure; QA before launch; and when reading results, analyze significance and segment by cohorts (Step-by-step guide).
- Mixpanel’s Experiments report is for analysis; for test execution and flagging, use a dedicated A/B testing tool that integrates with Mixpanel (e.g., AB Tasty) to maintain performance and privacy standards (Why you need a strong A/B testing solution; Mixpanel + AB Tasty).
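For the sample-size estimate, the standard two-proportion formula gives a back-of-envelope number; this is generic statistics, not a Mixpanel API.

```javascript
// Sample size per variant for a two-proportion test.
// Defaults: alpha = 0.05 two-sided (z = 1.96), power = 0.8 (z = 0.84).
function sampleSizePerVariant(baselineRate, mde, zAlpha = 1.96, zBeta = 0.84) {
  const p1 = baselineRate;
  const p2 = baselineRate + mde; // minimum detectable effect, absolute
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (mde ** 2));
}

// Detecting a lift from 10% to 12% conversion needs ~3.8k users per arm:
console.log(sampleSizePerVariant(0.10, 0.02)); // -> 3834
```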
Notes
- The Experiments report is a separately priced Enterprise product (Experiments overview).
- The legacy Experiments app is deprecated as of Nov 1, 2025; use the new Experiments report instead (Experiments app deprecation).
