Lean UX Hypothesis Template for Product Managers

Lean UX Hypothesis Template

While stating the critical hypotheses around your product is already part of the early product discovery phases, it’s sometimes difficult to find the right format for executing against them. I ran into this situation just recently and decided to put together a straightforward hypothesis template, which I want to share with you as a free resource.

How to use the Hypothesis Template

The hypothesis itself should be stated following the structure of what you want to build, for whom, and to achieve what kind of impact. A simple example would be:

We believe building a prominent “add contact” opportunity in the news stream for freshly registered users will achieve an increase of 10% WAU within this segment.

Afterwards, you have to define which qualitative or quantitative experiments will help you validate this hypothesis. Ideally, you can come up with an experiment which doesn’t involve your development team (hint: I’m talking about similar challenges at the 2017 Product Management Festival in Zurich).
For our example from above, one experiment could be sending out contact recommendation e-mails even earlier than originally planned to check whether triggering more contact adds supports our hypothesis.

Finally, we have to set a time constraint for our experiment(s) to run and define the target outcome at which we would consider the hypothesis to be validated.
For example:

30% click rate on the provided contact recommendations and 50% of recipients becoming WAU within 30 days.
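
For those who like to see the structure spelled out, here is a minimal sketch of the template’s fields as a Python data structure. The class and field names are my own illustration and not part of the downloadable template:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Hypothesis:
    """One entry in the hypothesis template: what we build, for whom, and to what impact."""
    what_we_build: str                  # the feature or change
    for_whom: str                       # the target segment
    expected_impact: str                # the impact we believe in
    experiments: List[str] = field(default_factory=list)  # qualitative/quantitative experiments
    time_constraint_days: int = 30      # how long the experiment(s) may run
    validation_criteria: str = ""       # target outcome at which we consider the hypothesis validated

# The example from above, expressed in the template's fields
contact_adds = Hypothesis(
    what_we_build="a prominent 'add contact' option in the news stream",
    for_whom="freshly registered users",
    expected_impact="increase WAU within this segment by 10%",
    experiments=["send contact recommendation e-mails earlier than originally planned"],
    time_constraint_days=30,
    validation_criteria="30% click rate on recommendations, 50% of recipients become WAU within 30 days",
)
print(contact_adds.expected_impact)
```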

Complementary Tools

The hypothesis template can be accompanied by the following tools, either prior to or after filling it out:

Besides the free download as a scalable PDF, you can also grab the hypothesis template from my GDrive and duplicate it into your own account to modify it.

I’m curious to hear how the template works out for you. Let me know on Twitter or via E-mail!

Productish Podcast Episode 34: The relevance of Technical Excellence in Product Development

In this episode of the Productish podcast, Tim and I discuss the importance and relevance of technical excellence in product development. Besides the actual definition of the term, we also touch on specific tools and best & worst practices.

Note: This episode of Productish is in German.

Listen to it on iTunes or SoundCloud and don’t forget to subscribe via RSS or directly on iTunes.

How XING excels at Product Management

My friend, former colleague and Product Tank Hamburg partner in crime Marc Kadish recently gave a talk at the Working Product conference in Hamburg.
He shared some of the key principles and frameworks XING uses to define its product roadmap and level up its product management game.

It’s the best (public) summary I’ve seen so far on the topic of operational product management at XING, which is why I want to share these valuable insights from my former employer with you.
You can get the whole talk and his slides from the Working Products website, but here are the key aspects of it.

All slides and images courtesy of Marc Kadish.

Impact Mapping

Product Management at XING - Impact Mapping

Quarterly Roadmap Rhythm

Product Management at XING - Quarterly Roadmap Rhythm

Design Thinking

Product Management at XING - Design Sprints
Read my take on what it takes to fuck up a Design Sprint and what specific mistakes you should avoid as a Product Manager.


Data-Informed Decision Making

Product Management at XING - Data Informed
Read my advice on how to avoid 5 common A/B split testing mistakes and which essentials you shouldn’t miss as a Product Manager.

Autonomy Through Alignment

Product Management at XING - Alignment
Here are my favorite tools for alignment, including the Auftragsklärung framework used at XING.

How to calculate your User Churn Rate

Calculating user churn rates is essential if you want to maintain a continuous growth strategy. Whether it’s for the primary stakeholders, upper management, or external investors: knowing your churn rate is crucial, especially when you’re running a SaaS product where your revenue is tied directly to the size of your user base.

Simply put, churn is the customers you’ve lost in a specific period of time. Tracking this metric forces you to re-evaluate your product strategy (and possibly do a reset). If churn is high, something about your product is turning away users.
The absolute churn numbers will of course come from your BI team – but it’s the relative metric, the user churn rate, which lets you compare your churn against forecasts and competitors.

This is why I’ve created a calculation sheet which I want to share with you. It focusses on the acquisition/user base side of churn. It provides the calculation base for 12 months and exposes your monthly and overall churn rates.
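
The spreadsheet does the math for you, but if you want to sanity-check the numbers yourself, a minimal sketch of the same monthly calculation could look like the following. All figures are invented, and the ‘overall’ formula at the end is one possible reading rather than necessarily the one the sheet uses:

```python
from typing import List, Tuple

def monthly_churn_rate(users_at_start: int, users_lost: int) -> float:
    """Relative churn for one month: share of the starting user base lost in that month."""
    return users_lost / users_at_start if users_at_start else 0.0

# Twelve invented months of (users at start of month, users lost during month)
months: List[Tuple[int, int]] = [
    (10_000, 400), (10_800, 380), (11_500, 450), (12_100, 500),
    (12_600, 430), (13_200, 470), (13_900, 520), (14_400, 490),
    (15_000, 540), (15_600, 510), (16_200, 560), (16_900, 580),
]

for i, (start, lost) in enumerate(months, start=1):
    print(f"Month {i:2d}: {monthly_churn_rate(start, lost):.1%} churn")

# One way to express the overall rate: total losses relative to the user base at the start of month 1
overall = sum(lost for _, lost in months) / months[0][0]
print(f"Overall churn: {overall:.1%}")
```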

As always, if you have any suggestions for modifications in the sheet, either send me an email or leave a comment in the file itself.

Sometime over the coming weeks, I’ll also create a sheet specifically for calculating payer churn – useful when managing a freemium product. While it’s then about your payer base instead of the overall user base, thinking in cohorts is also beneficial when, e.g., evaluating upsell mechanisms within your product or upsell-focused marketing campaigns.

Product Managers need to be micro Pessimists but macro Optimists

I recently heard about this attitude in a podcast interview with Raylene Yung, who reported that it reflects the overall company values at Stripe.
But instead of applying it ‘only’ on a business level, I think it’s also quite a suitable framework for the split of perspectives product managers face on a regular basis.

It’s kind of an analogy to the discovery and delivery separation every product manager needs to handle. While one side (delivery) focusses on execution, meaning ongoing sprint routines, the details of backlog items, and hitting milestones on time, the other (discovery) focusses on customer problems worth solving and what to build next.

From my experience, optimism is key when going into a product discovery. That doesn’t necessarily mean being overly enthusiastic about a certain feature.
Rather, you should look forward to digging into customer or business problems and trust that along the creative journey you’ll identify something valuable you can build on with your team.


On the execution side, I found the ‘right’ degree of pessimism helpful for staying on track. That doesn’t mean you should doubt the success of your team on a regular basis, but that you should challenge (your own) priorities and decisions at least during every sprint planning.
Don’t take yesterday’s decisions for granted – as long as corrections still fit into the overall product vision.
Remain skeptical that the path you chose for the next sprint will still be the right one when looking back at it in the retrospective. This way, you’ll stay hungry for ongoing improvement and avoid a ‘now we just need to build it’ attitude.

Your macro perspective should always be mostly positive. This will give you the confidence to question decisions on a micro level more regularly without becoming scared that you’re losing touch with the big picture.

Why Retesting your Experiments is crucial for real Optimization Impact

Did you run AB tests during your career as a product person?
Good.

Did you manage to achieve significant results with at least one of those tests?
Great.

Did you reject the same idea when a colleague suggested it as an experiment 6+ months after you ran it?
Not good. Wait, what?

What might seem counterintuitive at first is one of the most common mistakes conversion optimizers make throughout their careers.
The primary motivation for testing hypotheses using AB tests is to gather data to avoid decisions which are solely based on gut feeling.
You also want to reduce waste by preventing colleagues in your company from putting effort into a testing idea you already invalidated.

But what if you’re causing more harm than good with this behavior? I recently had the chance to attend a talk by Willem Isbrucker, Senior Product Owner at Booking.com, during our Product Tank Hamburg event and was taught a valuable lesson.

When he presented the audience with an AB test setup he had run and asked for the winner, 80% in the room predicted the right version (the variant). But then he revealed that the 20% who expected the control version to win were also right in a certain way: the same experiment had run a year earlier with the opposite result.
While you could assume a mistake in the analytics, the real reason was that the environment around the tested element (a search box) had evolved, and thereby the element itself performed better when visualized differently.

And while looking back at the set of AB tests I ran throughout my own career, I also rejected ideas for retesting because… well, we already proved the hypothesis to be wrong or right.

So, whenever you get presented with a testing idea which ran 6+ months in the past, look again at your product and check the situation:
Did you make changes to the main navigation in the meantime? Did you test within a particular segment or across the whole user base? Does it maybe make sense to target the change at a particular user group? Could the recently introduced pricing tier impact the test this time around?

Try to find a balance between reducing waste and embracing the retesting of known hypotheses. You might be surprised by the impact you can generate.
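
If you do end up rerunning an experiment, the evaluation step is the same as the first time around. As a purely illustrative sketch (the numbers are invented and have nothing to do with Willem’s actual experiment), a two-proportion z-test comparing control and variant for both runs could look like this:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates (pooled z-test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Invented numbers: the original run vs. a rerun of the 'same' experiment months later
print(f"original run: p = {two_proportion_z_test(480, 10_000, 400, 10_000):.3f}")  # control wins significantly
print(f"rerun:        p = {two_proportion_z_test(480, 10_000, 560, 10_000):.3f}")  # variant wins significantly
```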

How Amazon Prime does Churn Prevention

I looked at the cancellation process of the Amazon Prime membership to find out how Amazon tries to prevent its Prime members from churning.

You can find the complete transcript of my teardown right below.

  1. Intro Slide
  2. So, I want to cancel my amazon prime membership. Where to start?
  3. Maybe over here?
  4. It’s an ‘Account-ish’ thing, I guess?
  5. Ah, right in the first row of options. Very handy.
  6. So much prime stuff. Bet they’ve hidden the cancelation option as deeply as possible…
  7. Oh, look. Quite accessible and neatly organized with other account-related settings. Nice.
  8. I love progress indicators. They’re great for setting user expectations at the beginning of a process (honestly, I’d have expected a longer one here).
  9. Ok, obviously bad timing for this teardown. Prime Day is around the corner and of course a huge counter-argument.
  10. But what do I see here? Three equally weighted buttons? Very un-amazonian. Let’s go through them.
  11. A reminder? That sounds fair and handy. Let’s check out this option
  12. Back where I started with a subtle but comprehensive confirmation. Great option as I can’t drop the membership much earlier anyways.
  13. In general, the meanwhile ‘standard’ practice for cancelation copy is applied right here: speaking in (losing) benefits. This one just lets me drop out of the funnel.
  14. Ok, let’s say I don’t want to keep those amazing benefits…
  15. Interesting that they’re aiming this counter-argument solely towards pricing. They didn’t ask for any reasons before?
  16. Probably they learned over time that this is the main reason for people to cancel Prime. Therefore, it makes perfect sense.
  17. I also really like that they don’t work with discounts to target ‘my’ pricing-related cancelation reason. It would depreciate the value of the membership.
  18. I still get the known options to receive a reminder or drop out of the funnel.
  19. So, is this already the final button to end the membership? Reads like it, but the progress indicator shows one more step…
  20. Ah, of course not. This is the final final call. Neat overview of the relevant facts (end date) and the known options.
  21. For some reason, the buttons are still equally weighted. I’m not asking for ‘dark patterns’ here, just for better user orientation.
  22. Clear and obvious confirmation of my cancelation and when I’ll lose access to my prime benefits.
  23. I like the (maybe too) subtle option to directly reverse my decision. Not screaming for attention but perfectly integrated.

My Recap of the Working Products 2017 Conference

While Germany overall might be a bit short on high-quality product management conferences, Hamburg is certainly not.
Not that long after the great MTP Engage conference took place, the Working Products started into its second edition, gathering passionate product and design people for two great days of exchanging thoughts and knowledge.
The main difference for me personally was that this time I also delivered a talk about how and when MVPs are too expensive in the context of validation (more on that in a later post).

Format & Organization

The event was held in the facilities of eparo, a UX consulting agency which is also the organizer of the format. Rolf and his team did a great job leveraging the available space for the conference format.

What was interesting about the Working Products was this year’s organizational schedule: While the ‘formal’ talks were limited to the mornings, the afternoons were dedicated to the interactive and collaborative ProductSpace sessions.
This distinction was a welcome change from traditional conference formats, and it was also attractive enough to keep most of the speakers around beyond their keynotes.

As all talks were held in parallel, one always had to choose between Track 1 and 2. Thankfully, the choice of which talks to attend wasn’t a limiting one, as all presentations were also recorded.
Here are some shared impressions and thoughts from the ones I attended.

Matthias Schrader

The opening keynote was held by Matthias Schrader, CEO of the agency SinnerSchrader. While he was a bit late due to a calendar mix-up, he delivered an interesting talk based on his recent book ‘Transformationale Produkte‘.
As I haven’t read the book yet, it was nice to get a glimpse into its content and visualization. And while I’m probably not the target group for the book, I would highly recommend it to companies facing challenges due to digital transformation, as it lays out the underlying mechanics instead of stopping at the glossy surface. It might spare you the next trip to Silicon Valley.

Working Products 2017 - Matthias Schrader



Heide Peuckert

Heide is Head of Product at Nijuko, a development agency based in Hamburg. I previously interviewed her on my German podcast Productish about what it takes to be a PO in a contractor environment.
She gave some extended insights into her daily challenges and struggles and pointed out some key pain points when it comes to working with clients – some examples were the lack of an overall product vision or the missing commitment to invest time and money in design, let alone user research.

Working Products 2017 - Heide Peuckert

Tim Rudolph

Tim was the first real ‘corporate’ guy speaking at the event. He leads the digital lab of the logistics company Hermes. And while he couldn’t share that many details regarding the actual output the unit provides, it was fun to see how they structure their division and which challenges they faced when trying to bring ideas out into the ‘real’ world.
Particularly in the real-world business of delivering packages, you can’t easily roll back an AB test in case a legal requirement is missing.

Roman Pichler

Roman Pichler is probably known to the majority of German product people out there. For me especially, he’s kind of a role model who profoundly influenced my appreciation for agile product ownership through his writing and shared thoughts.
So it was obvious that I had to attend the talk he gave to open day 2 of the conference. His talk was heavily based on his recent book ‘Strategize‘, which is part of my constantly growing library of product books.

If you’ve read the book, the talk didn’t add much new stuff to it, but it was impressive to experience Roman live and in action.

Working Products 2017 - Roman Pichler

Florian Grote

The last talk I attended was by Florian, who shared incredible transformational insights from the music hard- and software company Native Instruments.
While I’m not much of a musician myself, I could imagine the challenges of setting up discovery and delivery tracks for such complex, yet intertwined products.

Working Products 2017 - Florian Grote

Summary of the Working Products 2017

My positive impressions of the Working Products 2016 were only confirmed by my attendance in 2017. I love the theoretical and practical balance of the format and have to give a huge compliment to the catering and overall organization.
Despite feeling like a small XING alumni meetup, complemented by some ‘other people,’ I had great chats, and it was an incredible pleasure to finally get back into public speaking. If you have an event to recommend for giving a product-related talk, feel free to suggest one or refer me.
(Almost) all slides are already available on the homepage, and the video recordings should follow in a couple of days. I’m curious about the lineup and location of the Working Products 2018 – see you there!

Alexa, what’s the definition of a Product Discovery?

As most of you know, the best way to understand a new technology is to work with it, instead of only using it. This was true for me during my personal ‘peak mobile’ phase, when I built an iOS app myself, and it has been the trigger for all of my side projects.

So while the Amazon Echo failed to cover a personal use case in my home, I remained curious whether I might have missed something that would let me recognize the real potential of the platform.

This is why I immediately jumped on the idea of building my very own Alexa Skill once I saw the excellent team at treehouse offering a course for it.
It didn’t take long for me to adapt the underlying idea of the course to a Skill which would work for my domain of expertise.
So I went ahead and built the Product Management Dictionary Skill for the Amazon Echo and other Alexa-enabled devices. Go ahead and give it a try!

So far, I’ve kept the scope of this initial release very straightforward: it only includes definitions for three product management terms and focusses on English as the system language (which is why it may not be possible to enable the Skill on your German Echo).
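
For those curious what such a Skill looks like under the hood, here is a minimal sketch of a Lambda-backed handler. The intent name, slot name, and definitions below are hypothetical stand-ins rather than the actual Skill’s code; only the response envelope follows the standard Alexa format:

```python
# Hypothetical lookup table backing the Skill; the real Skill ships with only a few terms so far.
DEFINITIONS = {
    "product discovery": "The phase in which a team figures out which customer problems "
                         "are worth solving before committing to delivery.",
    "minimum viable product": "The smallest version of a product that lets you validate "
                              "a hypothesis with real users.",
}

def lambda_handler(event, context):
    """Minimal AWS Lambda entry point for a custom Alexa Skill: look up a term and speak its definition."""
    request = event["request"]
    speech = "Ask me for the definition of a product management term."

    # 'DefineTermIntent' and the 'Term' slot are invented names, not the Skill's actual interaction model.
    if request["type"] == "IntentRequest" and request["intent"]["name"] == "DefineTermIntent":
        term = (request["intent"]["slots"]["Term"].get("value") or "").lower()
        speech = DEFINITIONS.get(term, f"Sorry, I don't have a definition for {term} yet.")

    # Standard Alexa response envelope
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```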

I’d love to hear your thoughts on this kind of adoption for Amazon Echo capabilities. Even though the primary job of this Skill was to let me dive deeper into voice products and the Alexa developer environment, I’d love to iterate this product into something serving broader needs within the product community.

If you want to suggest another term to be included in the Skill’s dictionary, just go ahead and add it to this Spreadsheet.