
Defining Success: Metrics That Matter

The Feature That "Shipped" But Didn't Work

There's a particular kind of failure common in product teams: you build a feature, you ship it, you announce it — and nobody ever asks if it actually worked. Features get counted as successes the moment they launch.

Success is not shipping. Success is changing a user behavior or a business outcome that you targeted before you started building.

The Metrics Hierarchy

North Star Metric

The single number that best captures the core value your product delivers to users. It should correlate strongly with long-term business health.

Examples:

  • Spotify: Monthly active listeners
  • Airbnb: Nights booked
  • Slack: Daily active users who send at least one message
  • A PM learning platform: Lessons completed per active learner per week

Input Metrics (Leading Indicators)

Metrics you can directly influence that predict movement in the North Star. If your North Star is "lessons completed," inputs might be: lesson completion rate per session, time between sessions, and module 1 completion rate.

Output Metrics (Lagging Indicators)

Business outcomes that result from good inputs: revenue, retention, NPS. These are real outcomes, but they move slowly and are hard to influence directly within a single sprint.
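The hierarchy above can be sketched in code. This is a minimal illustration using the PM learning platform example; all metric names and values are hypothetical, and the flagging threshold is an arbitrary choice:

```python
# Metrics hierarchy sketch for the hypothetical PM learning platform.
north_star = "lessons_completed_per_active_learner_per_week"

# Input metrics (leading indicators) the team can influence directly.
input_metrics = {
    "lesson_completion_rate_per_session": 0.62,
    "days_between_sessions": 4.5,
    "module_1_completion_rate": 0.48,
}

# Output metrics (lagging indicators) that follow from good inputs.
output_metrics = {
    "monthly_retention": 0.71,
    "nps": 34,
}

def flag_movement(current, previous, threshold=0.05):
    """Return input metrics whose relative change exceeds the threshold.

    A weekly review might use this to spot which leading indicators moved.
    """
    flags = {}
    for name, value in current.items():
        prev = previous.get(name)
        if prev and abs(value - prev) / prev > threshold:
            flags[name] = value - prev
    return flags
```

A review of inputs like this answers "are we on track?" weeks before the lagging outputs move.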

The HEART Framework

Google's framework for measuring UX quality across five dimensions:

  • Happiness: How satisfied are users? (Surveys, NPS, app ratings)
  • Engagement: How often and deeply are users interacting? (Sessions, actions per session, DAU/MAU)
  • Adoption: Are new users taking up the product, and existing users the new features? (Feature activation rate, time-to-first-action)
  • Retention: Are users coming back? (D7/D30 retention, churn rate)
  • Task Success: Are users completing what they set out to do? (Completion rates, error rates)

Vanity Metrics vs. Actionable Metrics

Vanity metrics make you feel good but can't drive a decision. Here are the most common swaps:

  • Total registered users → Weekly active users
  • Total page views → Pages per session for converting users
  • Features shipped → Feature adoption rate
  • App downloads → Day-7 retention after download
  • Emails sent → Click-through-to-activation rate

The test for a good metric: Can you make a decision based on this number? If it goes up, do you know what to do? If it goes down, do you know where to look?
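To make one of the swaps above concrete: total registered users can only go up, but weekly active users can fall and tell you where to look. A minimal sketch, with hypothetical users and dates:

```python
from datetime import date, timedelta

# Vanity: total registered users only ever grows.
registered_users = ["u1", "u2", "u3", "u4", "u5"]

# Hypothetical last-seen dates per user.
last_seen = {"u1": date(2024, 3, 7), "u2": date(2024, 3, 1), "u5": date(2024, 2, 10)}

def weekly_active_users(last_seen, today):
    """Actionable: users active in the trailing 7 days. Can go down."""
    cutoff = today - timedelta(days=7)
    return [u for u, d in last_seen.items() if d > cutoff]
```

Here only one of five registered users was active in the past week, a signal the registration count alone would never surface.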

Setting Targets Before You Build

One of the most powerful PM habits: set a success metric and a target *before* the feature ships. Not after.

Before any significant initiative, write down:

  • What metric will move if this works?
  • What's the baseline today?
  • What's the target in 30 / 60 / 90 days?
  • What's the threshold that would tell us to change course?
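The checklist above fits in a small record you can write before a line of feature code exists. This sketch uses a hypothetical onboarding initiative; every name and number is illustrative:

```python
from dataclasses import dataclass

@dataclass
class SuccessTarget:
    """A success metric committed to before building. All fields set up front."""
    metric: str
    baseline: float
    target_30d: float
    target_60d: float
    target_90d: float
    abort_threshold: float  # below this, change course

    def verdict(self, observed, day):
        """Honest post-launch check at day 30, 60, or 90."""
        if observed < self.abort_threshold:
            return "change course"
        target = {30: self.target_30d, 60: self.target_60d, 90: self.target_90d}[day]
        return "on track" if observed >= target else "behind"

# Hypothetical example: an onboarding revamp targeting module 1 completion.
onboarding_revamp = SuccessTarget(
    metric="module_1_completion_rate",
    baseline=0.48,
    target_30d=0.52, target_60d=0.56, target_90d=0.60,
    abort_threshold=0.45,
)
```

Writing the record down first is the point: the targets can't quietly shift after launch to match whatever the feature happened to achieve.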

This habit builds PM credibility faster than almost anything else. It demonstrates that you're outcomes-focused, not just output-focused.

Key Takeaway: North Star metrics capture the core value you deliver. Leading indicators tell you if you're on track. Lagging indicators tell you the business result. Define your success metric and your target *before* you build — then track it honestly after.