This week is all about using Mixpanel’s machine learning and statistical analysis to predict user behavior, understand the correlation between mission-critical events and value moments, and closely examine the efficacy of a newly launched product or feature.
In other words: segmenting our users, answering “did it work?”, and understanding “which events are correlated with my core segments?”.
Define groups of users who share a set of properties or have performed a particular group of events. When you create a cohort, you can use it to group and filter data in the Analysis reports.
Check out the full overview here and go ahead and create some core segments:
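Conceptually, a property-based cohort is just a reusable filter over user profiles. Here is a minimal sketch of that idea; the property names (“plan”, “country”) and users are made up for illustration, not Mixpanel’s actual data model:

```python
# Sketch: a cohort as a reusable predicate over user profiles.
# Property names and users below are hypothetical.

def make_cohort(**required_props):
    """Return a predicate matching users who share the given properties."""
    def belongs(user):
        return all(user.get(k) == v for k, v in required_props.items())
    return belongs

users = [
    {"id": "u1", "plan": "pro", "country": "UK"},
    {"id": "u2", "plan": "free", "country": "UK"},
    {"id": "u3", "plan": "pro", "country": "US"},
]

pro_uk = make_cohort(plan="pro", country="UK")
cohort_ids = [u["id"] for u in users if pro_uk(u)]
print(cohort_ids)  # ['u1']
```

The same cohort predicate can then be reused to group and filter data across any report, which is exactly why cohorts are worth defining once and naming well.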
These users are behaving positively.
Think of your ideal user and the events they would trigger. How often do they trigger those events? Are they on a particular pricing plan?
Write out your definition of a power user in natural language, then translate it into a cohort like the one linked here.
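Translating a natural-language definition into a behavioural rule can be sketched in a few lines. The event name (“Watched Video”), the threshold (3 times) and the window (7 days) below are illustrative assumptions, not Mixpanel defaults:

```python
# Sketch: classifying "power users" from a raw event log.
# Event name, threshold and window are illustrative assumptions.
from datetime import datetime, timedelta

NOW = datetime(2024, 6, 8)  # fixed "today" so the example is reproducible

events = [  # (user_id, event_name, timestamp): hypothetical log
    ("u1", "Watched Video", NOW - timedelta(days=1)),
    ("u1", "Watched Video", NOW - timedelta(days=2)),
    ("u1", "Watched Video", NOW - timedelta(days=3)),
    ("u2", "Watched Video", NOW - timedelta(days=10)),
]

def is_power_user(user_id, events, event="Watched Video",
                  min_count=3, window_days=7):
    """Did this user trigger the event at least min_count times in the window?"""
    cutoff = NOW - timedelta(days=window_days)
    count = sum(1 for uid, name, ts in events
                if uid == user_id and name == event and ts >= cutoff)
    return count >= min_count

print(is_power_user("u1", events))  # True: 3 watches in the last 7 days
print(is_power_user("u2", events))  # False: last watch was 10 days ago
```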
These are users you haven’t seen in a while, or you’ve seen them but not in a meaningful way, i.e. an app session but no purchase/watch event. We suspect these users may be forgetting about your platform.
You can create a cohort like this by listing what they “did not” do in a specific time frame. It can also be the inverse of your power user query: if power users “watched a video 3 times in 7 days”, dormant users might have “not watched a video at least 1 time in 7 days”.
Here’s an example
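The “did not do” inversion can be sketched as set subtraction over the same kind of event log: everyone minus the users who performed the key event inside the window. Event names and the 7-day window are again assumptions for illustration:

```python
# Sketch: dormant users = all users minus those who performed the key
# event inside the window. Event names and window are assumptions.
from datetime import datetime, timedelta

NOW = datetime(2024, 6, 8)
WINDOW = timedelta(days=7)

events = [  # (user_id, event_name, timestamp): hypothetical log
    ("u1", "Watched Video", NOW - timedelta(days=2)),
    ("u2", "App Session", NOW - timedelta(days=1)),    # visited, but no watch
    ("u3", "Watched Video", NOW - timedelta(days=20)),  # outside the window
]
all_users = {"u1", "u2", "u3"}

active = {uid for uid, name, ts in events
          if name == "Watched Video" and ts >= NOW - WINDOW}
dormant = sorted(all_users - active)
print(dormant)  # ['u2', 'u3']
```

Note that u2 still had a session: “seen, but not in a meaningful way” is exactly why the cohort is defined by the missing value event rather than by absence of any activity.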
Measures the effect of product or marketing launches on your key metrics by answering the question “did it work?”, as in “did this specific feature we launched 31 days ago work?” (I said 31 days because the report needs 30 days’ worth of data to produce any result.)
Impact calculates user adoption of the launch, the launch’s effect on an important event, and the differences between users who adopt the launch and those who do not. It presents this as a plain-English summary alongside the graphs and tables that hold the detail, all of which are explained here.
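A back-of-the-envelope version of that adopter/non-adopter comparison looks like this. All users and counts below are fabricated for illustration, and this is a gross simplification of whatever Impact computes internally:

```python
# Sketch: compare a key event's rate between feature adopters and
# non-adopters. All data here is made up for illustration.

users = {
    # user_id: (adopted_new_feature, purchases_in_30_days)
    "u1": (True, 4), "u2": (True, 3), "u3": (True, 5),
    "u4": (False, 1), "u5": (False, 2), "u6": (False, 1),
}

adopters = [n for adopted, n in users.values() if adopted]
others = [n for adopted, n in users.values() if not adopted]

adopter_rate = sum(adopters) / len(adopters)   # avg purchases per adopter
other_rate = sum(others) / len(others)         # avg purchases per non-adopter
lift = (adopter_rate - other_rate) / other_rate

print(f"adopters avg: {adopter_rate:.2f}, non-adopters avg: {other_rate:.2f}")
print(f"lift: {lift:+.0%}")  # +200% on this toy data
```

The real report also handles the part this sketch skips entirely: separating correlation from causation, since adopters may simply be your more engaged users to begin with.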
Measure and quantify the association between correlated events. This is the place for curiosity; check it at least every quarter!
This report produces a list of events that correlate with our default retention cohorts OR your very own cohort. Here you are answering “what did they do?” so you can double down on what works and iterate on what isn’t working so well.
I always sort it by correlation strength and make sure these events are analysed within my funnels, flows, cohorts and more, so I can prove them out in other reports continuously until I reach the magical Facebook-esque formula.
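Ranking events by correlation strength can be sketched like this. The per-user counts are fabricated, and a plain Pearson correlation is a simplification of whatever the report computes internally; it is only meant to show why sorting by the *absolute* correlation surfaces the strongest candidates first:

```python
# Sketch: rank events by how strongly their per-user counts correlate
# with retention. Data is hypothetical; Pearson r is a simplification.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# One entry per user: did they retain (1/0), and how often did they
# trigger each event? All numbers are made up.
retained = [1, 1, 1, 0, 0, 0]
event_counts = {
    "Watched Video":   [5, 4, 6, 1, 0, 1],
    "Opened Settings": [2, 0, 1, 2, 1, 3],
}

ranked = sorted(event_counts,
                key=lambda e: abs(pearson(event_counts[e], retained)),
                reverse=True)
print(ranked[0])  # the strongest candidate to prove out in other reports
```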
The homework for this week is the same as last week because we are still answering questions.
Select a “question” that is relevant to you, that you haven’t “answered” before in your own project, and share any insight from this report in the comments.
e.g. I answered “how long does it take for users to onboard?” and found it contributed to a 50% increase.