How can I export data that exceeds the UI limit?
You have a few options depending on what you’re exporting and why it exceeds the UI limits:
1. Export full raw events (bypass report segment limits)
UI segmentation limits don't apply to raw exports. Use the Raw Event Export API to pull all underlying events as JSONL, filtered by date and (optionally) by event name or a `where` expression. [Export methods]

Example:

```
curl --request GET \
  --url 'https://data.mixpanel.com/api/2.0/export?project_id=<PROJECT_ID>&from_date=2024-07-23&to_date=2024-07-24' \
  --header 'accept: text/plain' \
  --header 'authorization: Basic <BASE64_SERVICE_ACCOUNT_CREDENTIALS>'
```

Notes:
- Returns all matching events; UI segment cardinality limits don't apply. [Raw export spec]
- Rate limits: 60 queries per hour, 3 queries per second, 100 concurrent queries. [Raw export spec]
- For large time ranges, vary the chunk size: query low-volume historical periods in larger chunks and high-volume periods in daily ranges, as in the sketch below. [Varied date ranges]
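A minimal sketch of that varied-chunk approach in Python, assuming a US-residency project and a service account (the endpoint and auth mirror the curl example above; the chunk sizes, dates, and output file name are illustrative):

```python
import datetime as dt
import requests

# Assumptions: US data residency and service account auth; use
# data-eu.mixpanel.com instead for EU-resident projects.
EXPORT_URL = "https://data.mixpanel.com/api/2.0/export"
AUTH = ("<SERVICE_ACCOUNT_USERNAME>", "<SERVICE_ACCOUNT_SECRET>")
PROJECT_ID = "<PROJECT_ID>"

def export_range(from_date: dt.date, to_date: dt.date, out) -> None:
    """Stream one inclusive date range of raw events (JSONL) to an open file."""
    resp = requests.get(
        EXPORT_URL,
        params={
            "project_id": PROJECT_ID,
            "from_date": from_date.isoformat(),
            "to_date": to_date.isoformat(),
        },
        auth=AUTH,
        headers={"accept": "text/plain"},
        stream=True,
        timeout=600,
    )
    resp.raise_for_status()
    for line in resp.iter_lines():
        if line:
            out.write(line + b"\n")

def chunks(start: dt.date, end: dt.date, days: int):
    """Yield (from, to) date pairs covering [start, end] in `days`-sized chunks."""
    cur = start
    while cur <= end:
        nxt = min(cur + dt.timedelta(days=days - 1), end)
        yield cur, nxt
        cur = nxt + dt.timedelta(days=1)

with open("events.jsonl", "wb") as out:
    # Low-volume history: weekly chunks (illustrative sizes).
    for a, b in chunks(dt.date(2024, 1, 1), dt.date(2024, 6, 30), 7):
        export_range(a, b, out)
    # High-volume recent period: one day at a time.
    for a, b in chunks(dt.date(2024, 7, 1), dt.date(2024, 7, 24), 1):
        export_range(a, b, out)
```

Staying within the 60 queries/hour limit may also require pacing requests, e.g. a `time.sleep` between chunks.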
2. Export formatted report data via Query API (higher limits than UI view)
If you want the aggregated report output (not raw events), Growth/Enterprise projects can use the Query API (the Insights, Funnels, and Retention APIs) to pull the same numbers you see in the UI, without the visual selection limits. [Exporting reports] This doesn't remove all limits, but it avoids UI interaction constraints such as segment selection; exact row caps for the Query API aren't detailed in the provided docs.
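For instance, a saved Insights report can be pulled with one authenticated request. This is a hedged sketch: the endpoint path and the `bookmark_id` parameter (the saved report's ID from its URL) are my reading of the Query API docs, not something confirmed in this thread.

```python
import requests

# Hedged sketch: endpoint and parameters per my reading of Mixpanel's
# Query API docs; <BOOKMARK_ID> is the saved Insights report's ID.
resp = requests.get(
    "https://mixpanel.com/api/query/insights",
    params={"project_id": "<PROJECT_ID>", "bookmark_id": "<BOOKMARK_ID>"},
    auth=("<SERVICE_ACCOUNT_USERNAME>", "<SERVICE_ACCOUNT_SECRET>"),
    timeout=120,
)
resp.raise_for_status()
print(resp.json())  # aggregated series: the same numbers the UI report shows
```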
3. Use CSV exports from reports (higher row limits than on-screen)
For Insights and other reports, CSV export has a higher row cap than what you see on screen:
- Most reports: 10,000 rows
- Insights non-line charts (Bar, Stacked Bar, Pie, Table, Metric): up to 50,000 rows [Segment limits]

If your UI view is truncated due to segment cardinality, the CSV export often returns more segments than the on-screen visualization.
4. Continuous full export via Data Pipelines (warehouse / cloud storage)
If you need all events and profiles on an ongoing basis and your data volumes are large, use Data Pipelines (a paid add-on for Growth/Enterprise):
- Continuously exports events, profiles, and identity mappings to S3/GCS/Azure or to warehouses (BigQuery, Snowflake, Redshift, etc.).
- Events land at paths like `<BUCKET_NAME>/<MIXPANEL_PROJECT_ID>/mp_master_event/<YEAR>/<MONTH>/<DAY>/`. [Data Pipelines overview]

This fully bypasses report and UI export limits and is best for large-scale analysis in SQL.
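To illustrate that layout, here is a hedged boto3 sketch that lists one day's exported event files for an S3 destination (the bucket name, project ID, and date are placeholders):

```python
import boto3

# Placeholders: your pipeline's destination bucket and project ID.
s3 = boto3.client("s3")
prefix = "<MIXPANEL_PROJECT_ID>/mp_master_event/2024/07/24/"
resp = s3.list_objects_v2(Bucket="<BUCKET_NAME>", Prefix=prefix)
for obj in resp.get("Contents", []):  # first page only (up to 1000 keys)
    print(obj["Key"], obj["Size"])
```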
5. Export profiles without UI limits
For user or group profiles, skip the UI limits by using the Engage API, which returns paged JSON results of profiles (see the paging sketch after the summary below). [Exporting profiles]

---

In practice:
- If you're hitting segment/cardinality limits in a report, try CSV export first, then the Query API, or drop to the Raw Event Export if you need everything.
- If you need ongoing, complete dumps at scale, use Data Pipelines.
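To illustrate option 5, a hedged sketch of Engage API paging, assuming service account auth and the session_id/page pagination scheme from Mixpanel's Engage docs (the output file name is illustrative):

```python
import json
import requests

URL = "https://mixpanel.com/api/2.0/engage"
AUTH = ("<SERVICE_ACCOUNT_USERNAME>", "<SERVICE_ACCOUNT_SECRET>")
PARAMS = {"project_id": "<PROJECT_ID>"}

profiles = []
payload = {}
while True:
    resp = requests.post(URL, params=PARAMS, data=payload, auth=AUTH, timeout=120)
    resp.raise_for_status()
    body = resp.json()
    profiles.extend(body["results"])
    if len(body["results"]) < body.get("page_size", 1000):
        break  # a short page means we've reached the end
    # Subsequent pages must carry the session_id from the first response.
    payload = {"session_id": body["session_id"], "page": body["page"] + 1}

with open("profiles.json", "w") as f:
    json.dump(profiles, f)
print(f"exported {len(profiles)} profiles")
```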
I use it, but it returns nothing 😢 I was looking for other ways, but I think there's an error in my script, which I'll solve soon 🤞
That's the best way to do this.
Makes sense. I realized I was treating a user property as an event property. Now I get distinct_id as an event property, and I'll map the corresponding user property via the Engage API. Is this the right approach? I've been trying this on a 2-day range, but it's extremely slow. Eventually I'll need to export a YTD dataset. Is there a way to make this process faster?
Yes, that's the right approach. In my experience it's usually pretty fast. How much data do you have in the last 2 days?
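One pattern that usually speeds this up: instead of querying the Engage API per user while processing events, export all profiles once (as in the paging sketch above) and join locally by distinct_id. A hedged sketch, with file names carried over from the earlier examples:

```python
import json

# Join raw events with user profiles locally, keyed by distinct_id.
# Assumes events.jsonl (raw export) and profiles.json (Engage dump) exist.
with open("profiles.json") as f:
    by_id = {p["$distinct_id"]: p.get("$properties", {}) for p in json.load(f)}

with open("events.jsonl") as f, open("events_enriched.jsonl", "w") as out:
    for line in f:
        event = json.loads(line)
        props = by_id.get(event["properties"].get("distinct_id"), {})
        event["user_properties"] = props  # attach the matched profile's props
        out.write(json.dumps(event) + "\n")
```

This turns one API round-trip per user into a single profile export plus an in-memory lookup, which matters a lot at YTD scale.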
