Data traps every PM should avoid
Good data analysis requires recognizing common pitfalls that lead to wrong conclusions. These traps catch even experienced product managers who let excitement or pressure override careful thinking:
- Survivorship bias hides failures from view. You analyze successful users without considering those who left. This makes your product seem better than it is and blinds you to serious problems. Always ask who's missing from your data.
- Correlation confusion strikes when two things happen together. Suppose mobile app usage rises at the same time push notifications increase. It might look like the notifications caused the growth, but both could be driven by a holiday season campaign. Correlation alone cannot prove cause: look for a plausible mechanism that explains the relationship, then run an experiment to test whether one factor truly drives the other.
- Sample size seduction happens when random variation looks like a pattern. A feature that shows 50% improvement with 10 users might show no difference with 1,000. Wait for sufficient data before declaring victory. Small samples lie more often than large ones.
- Data selection bias happens when you only use facts that support what you want to believe. Decide on success metrics before looking at results. Document all analyses, not just the ones that support your hypothesis.
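Survivorship bias is easy to see in a toy simulation. The numbers below are invented: a cohort with satisfaction scores spread evenly from 0 to 10, where the assumption is that less-satisfied users are more likely to churn. Averaging only the users who stayed paints a rosier picture than the full cohort.

```python
import random

random.seed(1)

# Hypothetical cohort: satisfaction scores from 0 (miserable) to 10 (delighted).
N = 10_000
satisfaction = [random.uniform(0, 10) for _ in range(N)]

# Assumed churn model: the happier the user, the likelier they are to stay.
retained = [s for s in satisfaction if random.random() < s / 10]

all_avg = sum(satisfaction) / len(satisfaction)
surv_avg = sum(retained) / len(retained)

# The "current users" average over-samples the happy and hides the churned.
print(f"avg satisfaction, everyone:       {all_avg:.1f}")
print(f"avg satisfaction, survivors only: {surv_avg:.1f}")
```

The survivors' average lands well above the true cohort average, which is exactly the gap you miss when churned users never appear in your dashboard.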
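The holiday-campaign scenario from the correlation bullet can be sketched numerically. All figures here are made up: a campaign flag that drives both push volume and app usage. The raw correlation between pushes and usage looks impressive, but restricting to campaign-off days (a crude way of controlling for the confounder) makes the link largely vanish.

```python
import random

random.seed(0)

# Hypothetical daily data: a holiday campaign boosts BOTH pushes and usage.
days = 200
campaign = [1 if d % 7 < 2 else 0 for d in range(days)]  # campaign-active days
pushes = [10 + 20 * c + random.gauss(0, 2) for c in campaign]
usage = [100 + 50 * c + random.gauss(0, 5) for c in campaign]

def corr(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Raw numbers suggest notifications drive usage...
overall = corr(pushes, usage)
print(f"overall corr(pushes, usage): {overall:.2f}")

# ...but holding the campaign fixed, the relationship mostly disappears.
off_p = [p for p, c in zip(pushes, campaign) if c == 0]
off_u = [u for u, c in zip(usage, campaign) if c == 0]
off = corr(off_p, off_u)
print(f"corr on campaign-off days:   {off:.2f}")
```

Conditioning on the confounder is a cheap first check; a controlled experiment (turning notifications up or down for a random subset of users) is the real test.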
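Sample size seduction can be demonstrated with an A/B test where variant B is, by construction, no better than A. The baseline conversion rate and group sizes below are assumptions for illustration: with 10 users per group, pure noise routinely looks like a double-digit "lift"; with 1,000 per group, the same noise shrinks toward zero.

```python
import random

random.seed(42)

TRUE_RATE = 0.30  # assumed conversion rate; both variants are identical

def observed_lift(n_per_group):
    """Simulate one A/B test with no real effect and return the
    relative lift the raw counts would naively suggest."""
    a = sum(random.random() < TRUE_RATE for _ in range(n_per_group))
    b = sum(random.random() < TRUE_RATE for _ in range(n_per_group))
    if a == 0:
        return float("inf")  # tiny samples can even produce zero conversions
    return (b - a) / a

# Repeat the no-effect experiment many times at each sample size.
small = sorted(abs(observed_lift(10)) for _ in range(1000))
large = sorted(abs(observed_lift(1000)) for _ in range(1000))

# Median absolute "lift" you'd see purely from randomness:
print(f"median |lift| at n=10:   {small[500]:.0%}")
print(f"median |lift| at n=1000: {large[500]:.0%}")
```

The typical phantom lift at 10 users per group is many times larger than at 1,000, which is why a flashy early result so often evaporates once real traffic arrives.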