
Impact Report Best Practices

  • 16 January 2020

Every week, we will release tips to help you get the most out of Mixpanel. Want to see more? Click here to see other #mpknowledgedrop articles.

 

Impact Report Best Practices

 

Hi everyone! Brandon, one of Mixpanel's PMs here. As you may have heard, we recently released a new report in Mixpanel: the Impact report. Impact, as the name implies, helps you assess the success of your product launches by showing how those launches influence user behavior in terms of your KPIs.

The Impact Report can be fairly complex, so in addition to explaining how it works in our documentation, I wanted to offer some tips for how to get the most value out of the report.

 

Pick the right launch date & time interval

In Impact, there are two date-related options to configure.  First is the launch date, which appears in the first section of the query builder.  Second is the time interval, which appears at the top left of the chart visualization.  Both of these affect the Impact query, so it’s important to choose wisely.


For the launch date, you'll want to pick the date when the feature you're analyzing became available to 100% of your users. While it's technically possible to use the Impact report on features that have been released to only a portion of your user base, it produces the most accurate results when the sample size is highest. That, of course, is when the feature is available to everyone.

However, for launch dates more than 90 days in the past, we recommend adjusting the date range to be “between” the original launch date and 90 days after it. The reason is that, long after a feature has been released, more and more factors are likely to influence user behavior (like other product and marketing launches), so it becomes harder to attribute metric changes to your launch.

For the time interval, you’ll want to set a value that’s less than or equal to the number of days since your launch.  If you launched a week ago, for instance, choose 7 or fewer days for the time interval. Otherwise, a large section of your report may drop to zero, simply because that time is in the future.
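If it helps to make that concrete, here's a rough Python sketch of the arithmetic (the dates are invented for illustration; in practice you simply pick these values in the query builder):

    from datetime import date

    launch_date = date(2020, 1, 9)                  # hypothetical launch date
    days_since_launch = (date.today() - launch_date).days

    # An interval longer than the days elapsed would include future days,
    # which show up as zeros in the report.
    desired_interval = 7                            # e.g. a one-week interval
    time_interval = min(desired_interval, days_since_launch)
    print(time_interval)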

 

Choose a high-frequency impacted event

Impact calculates the frequency at which adopters and non-adopters perform your impacted events via the following formula:

# of impacted events performed that day / # of users in the adopter/non-adopter group that day

Then, we take the average of that value over the course of many days. So, as you can see, if users only rarely perform your impacted event (e.g. something like “Purchase Yearly Subscription”), you'll get lots of zeros on days when no one in the group performs the event, and spikes on days when a few users do. This spikiness makes it hard to calculate a consistent, stable average, which in turn reduces the reliability of the Impact results.
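To see why in code, here's a minimal Python sketch of that calculation (the numbers are made up, and Mixpanel computes this for you; the point is just that rare events produce unstable averages):

    # Hypothetical per-day tallies for one group (adopters or non-adopters)
    events_per_day = {"day 1": 40, "day 2": 0, "day 3": 0, "day 4": 55}
    users_per_day  = {"day 1": 100, "day 2": 90, "day 3": 95, "day 4": 110}

    # Daily frequency: impacted events that day / users in the group that day
    daily_frequency = {day: events_per_day[day] / users_per_day[day]
                       for day in events_per_day}

    # Impact then averages the daily frequencies across many days.
    # A rarely performed event yields mostly zeros plus occasional spikes,
    # so this average swings a lot from sample to sample.
    average_frequency = sum(daily_frequency.values()) / len(daily_frequency)
    print(average_frequency)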

Therefore, most of the time, you'll want to analyze metrics that represent user engagement - events like “Browsed Item”, “Opened App”, “Sent Message”, or “Requested Ride”. These types of events track when users find value in your product, and assuming your product is valuable, users will tend to perform them fairly often. When you choose these types of events as your impacted events, you'll get the most value out of the Impact report.

 

Analyze the results at the right time

While it may be tempting to immediately dive into the Impact report just after you’ve launched your new feature, we recommend exercising some patience.  As mentioned before, the Impact report needs a good sample size to produce the best results - the more users, the better. So, give those users an opportunity to try out the new feature.  We’ve found that waiting about a week after the launch tends to produce a statistically confident result.

 

Try Impact today!

I hope these tips are helpful as you dive into the Impact Report!  As always, if you have any feedback, feel free to comment on this thread, or submit ideas in the feedback section.  

 

Best,

Brandon

Product Manager @ Mixpanel


2 replies

Hi @brandon, thanks for the tips here. I am trying to measure the impact of a change in our product on a specific event. We're struggling to identify the launch event, as it isn't an event that gets clicked or performed by a user. We also haven't yet seen a reason to trigger it for every user.

Let me give you an example: imagine we change all border colours from grey to orange, with no feature changes. We want to know what impact this had on a very well-defined impacted event. Is it possible to use the Impact report for this case? If so, how should it be set up?


Hi @Irfan, thanks for the question!  Unfortunately, the Impact report isn’t an ideal tool for analyzing something like a color change, because your users don’t have a choice of whether or not to accept those color differences.

The Impact report relies on users choosing to use a new feature. Users that use a new feature are adopters; users that don't are non-adopters. We then compare the differences in behavior between those two user groups to determine the impact of the new feature. That's why a Launch Event is required: we need to know whether or not a user took the action to use the new feature.
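As a rough illustration of that grouping (the event names here are hypothetical, not real Mixpanel events):

    # Split users into adopters and non-adopters by whether they performed
    # the Launch Event
    launch_event = "Used New Feature"
    user_events = {
        "user_1": ["Opened App", "Used New Feature", "Sent Message"],
        "user_2": ["Opened App", "Sent Message"],
        "user_3": ["Used New Feature"],
    }

    adopters = {u for u, events in user_events.items() if launch_event in events}
    non_adopters = set(user_events) - adopters
    # Impact then compares impacted-event frequency between the two groups.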

However, for something like a color change, users don’t have a choice but to adopt the new colors, so everyone is an adopter.  That means we can’t compare their behavior to non-adopters, and therefore can’t determine the impact of the change.

Instead, I would recommend a traditional A/B test for this type of change.  Give 50% of users the orange colors, and 50% the gray, and see which group performs your impacted event more often.
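Once you've run such a test, one standard way to check whether the difference between the two groups is real is a two-proportion z-test. Here's a minimal Python sketch (the numbers are made up, and this isn't a Mixpanel feature, just textbook statistics):

    from math import sqrt
    from statistics import NormalDist

    # Hypothetical results: users who performed the impacted event in each variant
    orange_users, orange_conv = 5000, 620    # orange borders
    gray_users, gray_conv = 5000, 575        # gray borders (control)

    p1, p2 = orange_conv / orange_users, gray_conv / gray_users

    # Pooled proportion and standard error for a two-proportion z-test
    p_pool = (orange_conv + gray_conv) / (orange_users + gray_users)
    se = sqrt(p_pool * (1 - p_pool) * (1 / orange_users + 1 / gray_users))

    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided
    print(f"z = {z:.2f}, p = {p_value:.3f}")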

Sorry this isn’t the answer you’re looking for, but I hope this helps give you some insight into how the Impact report works, and the scenarios it can help address.
