Ask a PM: AMA with Vlado Hruda, Sr. Product Manager at Mixpanel

  • 18 September 2020
  • 4 replies

In this edition of Ask a PM, we brought you an exclusive AMA-style webinar with Vlado Hruda, Mixpanel’s very own Senior Product Manager (formerly of Google), moderated by Product Designer, Katie Siu.

Vlado and Katie answered your burning questions on everything from finding your focus metric to leveraging product analytics to answer UX-related questions to driving retention...but they received way more questions than we could possibly answer, so we’re continuing the conversation here on QBQ.

We’ll be posting answers to some of the questions we didn’t have a chance to respond to, as well as responding to new questions posed in this thread.


And in case you missed the live session or need a quick refresher, you can find answers to all of these questions in the recording:

  • How do you prioritize what feedback is important? Do you have any frameworks you might be able to recommend? 
  • How do you then figure out what your team should tackle first?
  • Do you find prioritizing backlogs like these differs between larger companies, like Google, vs. startups?
  • How do you use analytics before a product release/launch to get design feedback? What happens when you have a smaller user base to pool feedback from?
  • How do you balance quantitative analytics vs. qualitative gut feelings? Have you ever found yourself going against what the data indicated? 
  • Do you have any advice around identifying and analyzing core metrics to help teams drive new user acquisition and retention? 
  • What's the best way to determine activation criteria?
  • When you get a result that you know will be controversial with stakeholders, what are your next steps? (e.g., an exec has a pet feature that doesn’t perform well)
  • What are three things you have learned over the years that you would have liked to know on your first day as a product manager?
  • What was the most valuable thing you took from your time at Google?



Follow-up questions we did not have time to cover in the video




How do you make design decisions without analytics at the product design stage, given that product design is very subjective and depends on your user base? Is there any framework that can be used to ground design decisions?

  • I've actually found that in these situations, a super valuable analysis is Funnels. Your goal is to track adoption of each feature as a “new product” from 1. discoverability, 2. set-up, 3. aha moment, to 4. habit moment. You can benchmark desired adoption levels against existing features (e.g. the Query Builder example), and from there you want to understand where the biggest gap in existing solutions is, or what users perceive as a missing piece. 

  • This will provide you with a stack-ranking of goals & problems you want to address that you can guide your designers with so they know if they need to double down on discoverability, or ease of use, or a particular flow in that feature. Each of these might have a very different outcome and will lead you to a different solution.
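The four-stage adoption funnel described above can be sketched in code. This is a hypothetical illustration, not Mixpanel's implementation: the event names (`feature_discovered`, `feature_set_up`, `aha_moment`, `habit_moment`) and the sample data are made up for the example.

```python
# Hypothetical sketch: step-by-step funnel conversion for a new feature,
# computed from a raw, chronologically ordered event log.
from collections import defaultdict

FUNNEL = ["feature_discovered", "feature_set_up", "aha_moment", "habit_moment"]

def funnel_conversion(events, funnel=FUNNEL):
    """events: list of (user_id, event_name) in chronological order.
    Returns users reaching each step and step-over-step conversion rates."""
    progress = defaultdict(int)  # user_id -> index of the next expected step
    for user, name in events:
        step = progress[user]
        if step < len(funnel) and name == funnel[step]:
            progress[user] = step + 1
    counts = [0] * len(funnel)
    for reached in progress.values():
        for i in range(reached):
            counts[i] += 1
    rates = [counts[i] / counts[i - 1] if i and counts[i - 1] else None
             for i in range(len(funnel))]
    return counts, rates

# Illustrative data: three users at different stages of adoption.
events = [
    ("u1", "feature_discovered"), ("u1", "feature_set_up"),
    ("u2", "feature_discovered"),
    ("u1", "aha_moment"), ("u3", "feature_discovered"),
    ("u3", "feature_set_up"),
]
counts, rates = funnel_conversion(events)
print(counts)  # [3, 2, 1, 0]
```

Comparing the `rates` list for a new feature against the same funnel run over an established one gives you the benchmark, and the step with the worst drop-off is the gap to hand your designers.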

How can we find out the time spent in a particular section of our product?

  • That's relatively easy. You can create a 2-step funnel to understand how long users take to complete a set of events. If you track navigation, you can see this information by session/time. If you’re already a Mixpanel user, you can check out how this works in our demo datasets.
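The 2-step funnel idea above can be sketched as follows. This is a simplified, hypothetical example, not Mixpanel's actual computation: the event names `section_viewed` and `section_exited` and the timestamps are illustrative.

```python
# Hypothetical sketch: time spent in a section of the product, measured as
# the duration between an "enter" and "leave" navigation event per user.
from statistics import median

def time_in_section(events, enter="section_viewed", leave="section_exited"):
    """events: list of (user_id, event_name, timestamp_seconds), time-ordered.
    Returns durations in seconds, one per completed enter -> leave pair."""
    open_at = {}       # user_id -> timestamp of the unmatched enter event
    durations = []
    for user, name, ts in events:
        if name == enter:
            open_at[user] = ts
        elif name == leave and user in open_at:
            durations.append(ts - open_at.pop(user))
    return durations

# Illustrative data: two users entering and leaving the section.
events = [
    ("u1", "section_viewed", 0), ("u2", "section_viewed", 5),
    ("u1", "section_exited", 40), ("u2", "section_exited", 65),
]
durations = time_in_section(events)
print(median(durations))  # 50.0
```

In practice you would look at the median (or a percentile) rather than the mean, since a few idle sessions can skew the average badly.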

How do you make sure to always have time for user tests?

  • Actually, it's even more important for change management than for new features, because you get a great sense of how much friction you are removing, and how much you are introducing, for existing users. Done right, it should not take more than 5 hours of work and can potentially save you 2-4 weeks of rework, so it's always a great time investment.

As a Product Manager, it gets easier with each day to let go of my own ideas I once thought were awesome that may no longer fit the direction the product is headed. How do you get the idea of "killing your darlings" across to users or, more importantly, stakeholders after presenting findings that their old ideas just don't quite jibe with the direction of the product?

  • The biggest learning for me as a PM was that I need to be solution-agnostic and problem-specific. Unless you have experience across engineering, design, and the domain, you are statistically unlikely to discover the best solution all by yourself. As a PM you can act as a guide, helping your team navigate toward the best solution. One of the best approaches is to be product-driven and to quickly try to validate multiple tactics. We usually use ICE prioritization to identify the low-hanging fruit we can try, and then pursue them accordingly.

  • For stakeholders, it’s important to have a social contract where they agree that what they care about is the results, not how you get there, and you should push back when that contract is broken. If there is too much disagreement, figure out how you can test each of the ideas, and focus on spending time validating them instead of debating them. You need new information to break the indecision.

  • For large projects, if you get a sense you cannot be objective in your judgment, you might want to ask another PM to step in if necessary.
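The ICE prioritization mentioned above can be sketched in a few lines. The tactic names and scores here are hypothetical; ICE simply scores each tactic on Impact, Confidence, and Ease (commonly 1-10 each) and ranks by the product.

```python
# Hypothetical sketch of ICE prioritization: score each candidate tactic on
# Impact, Confidence, and Ease (1-10), rank by the product of the three.
def ice_score(impact, confidence, ease):
    return impact * confidence * ease

# Illustrative backlog of tactics with (impact, confidence, ease) scores.
tactics = {
    "improve CTA discoverability":  (8, 6, 9),
    "redesign onboarding flow":     (9, 5, 3),
    "add tooltip to query builder": (4, 8, 9),
}

ranked = sorted(tactics, key=lambda t: ice_score(*tactics[t]), reverse=True)
for name in ranked:
    print(name, ice_score(*tactics[name]))
```

The top of the ranking is your low-hanging fruit: high expected impact that is also cheap to try, which is exactly what you want when the goal is to validate multiple tactics quickly.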

Which feature(s) in Mixpanel is best for gaining UX insights?

  • It’s actually a combined use of the Funnels, Flows, and Retention reports for discovering the activation moment and habit moment. Once you understand, in Flows, the path users are trying to take and why they get stuck, you can figure out the biggest roadblock preventing them from being successful. You can then optimize this path via Funnels. For instance, we once improved adoption of an AI feature by 500% just by focusing on the discoverability of the CTA and optimizing the content for consumption on the right device.

  • Also, if you are focused on optimizing adoption alone, Insights can be a great resource for benchmarking your adoption against other similar UI controls within the product, to better understand the size of the discoverability gap.

What are your favorite tricks for creating user stories within Mixpanel that most users don't know about?

  • The best solution is actually getting real customer quotes. There is nothing more powerful than bringing a real customer story to your team and having everyone think through the challenges that user is facing in that particular situation. We've found making Zoom recordings of customer calls very powerful, as we can always go back to the video to gather more detail on a particular topic. We also have a great process for collecting customer gaps that helps us gather this information at scale.



Hi @vlado, what’s your favorite way to analyze product changes and their effects on metrics?

I’ve tried manually adding annotations on the dates we make changes, but I’m curious if there’s a more “seamless” method. (I know the Impact report helps, but not all of our changes work as launch events, because the more minor changes are not recorded as events.)


Hi @arshia93, thank you for submitting your question! In a perfect world you would like to release every feature as an experiment, so you can directly measure its impact on key metrics and make a call whether to go fully GA or revisit the feature before releasing it further.

Also, the best practice is to make your tracking MECE (Mutually Exclusive, Collectively Exhaustive), so maybe it’s a question of workflow, where teams forget to add new tracking when a feature is launched. In general, we have a rule at Mixpanel that you cannot proceed to GA without tracking in place, which helps prevent any omissions. 


Thanks for the response @vlado. The only thing that makes this tricky is that startups often can’t reach high statistical significance. 

In the meantime, we launch A/B tests on more significant changes (e.g., price), but for less major changes I’ve been comparing conversion data from before and after the change.
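The before/after conversion comparison described above can be given a rough significance check with a two-proportion z-test (normal approximation). This is a generic statistics sketch, not a Mixpanel feature; the sample counts below are illustrative, and for small startup samples the caveat about low statistical power still applies.

```python
# Hypothetical sketch: two-proportion z-test comparing conversion rates
# before and after a product change (normal approximation, two-sided).
from math import sqrt, erf

def two_proportion_z(conv_before, n_before, conv_after, n_after):
    """conv_*: number of conversions; n_*: number of users in each period.
    Returns (z statistic, two-sided p-value)."""
    p_before = conv_before / n_before
    p_after = conv_after / n_after
    p_pool = (conv_before + conv_after) / (n_before + n_after)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_before + 1 / n_after))
    z = (p_after - p_before) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative figures: 120/1000 conversions before vs. 150/1000 after.
z, p = two_proportion_z(120, 1000, 150, 1000)
print(round(z, 2), round(p, 3))  # 1.96 0.05
```

A p-value near 0.05 on a thousand users per side shows how borderline these comparisons get at startup scale, which is why annotating the change date and watching the trend over a longer window is still a sensible complement.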