Humans are not particularly rational decision-making beings. Left to our own devices, we’re susceptible to a wide variety of mistaken beliefs, flawed heuristics, and simple logical fallacies.
We acknowledge and address these natural susceptibilities when it comes to our customers’ behavior. But let’s hold a mirror up to ourselves! Our work as growth marketers and product managers is also vulnerable to cognitive bias, no matter how “data-driven” we say we are.
With an awareness of how our brains tend to trick and trap us, we can find a path to clearer thinking and better decision-making. Here are three specific cognitive biases to keep a careful eye out for:
Automation bias is the tendency to believe that what your tools and systems tell you must be correct, even in the face of contrary evidence. This bias is tough to escape, as automation increasingly infuses everyday life, from getting the day’s weather to driving a car to automating rote tasks at work.
Automation lulls us into a sense of false security and lack of questioning. When it comes to problem-solving in areas like growth and product, we might find ourselves lulled into following “best practices” and advice on autopilot, just like how we follow every turn-by-turn direction given by GPS navigation.
We want to be successful, and in the pursuit of that success, we try to emulate what was done in the success stories we hear. In other words, we go on what Andrew Chen calls “advice autopilot”:
The problem is, the best advice rarely comes in this kind of format – instead, the advice will start out with “it depends…” and takes into account an infinite array of contextual and situational things that aren’t obvious. However, we are all lazy and so instead we go on autopilot, and do, read, say, and build, all the same things.
Advice autopilot is often well-intentioned and can still be helpful, revealing good insights to the uninitiated, but following it risks shortcutting your way to a doomed business.
Everyone who’s had some measure of success did so by navigating deeply “contextual and situational” problems in a specific way. You’re not going to emulate their success by putting yourself on autopilot. Your set and sequence of growth navigation directions come from your own map.
First described by psychologists Amos Tversky and Daniel Kahneman, the illusion of validity is a cognitive bias that causes you to overestimate the accuracy and soundness of your judgment.
When Kahneman was 21 years old serving in the Israeli army, he helped run tests designed to evaluate and predict the best candidates for officer training. Kahneman and his colleagues would watch exercises and take copious notes on the soldiers’ behavior, their group dynamics, and their individual traits like aggression and patience. After each exercise, they evaluated each soldier and his readiness for officer training. “Under the stress of the event, we felt, each man’s true nature revealed itself in sharp relief,” Kahneman said. For a soldier who led the team during the exercise, for example, “the obvious best guess about how he would do in training, or in combat, was that he would be as effective.”
These best guesses turned out to be “largely useless.” With each new batch of candidates, their predictions failed, proving only slightly better than random guesses. The “statistical evidence of [their] failure” didn’t dissuade them or change how they made their predictions—in fact, Kahneman notes incredulously, they carried on with a deep conviction that their predictions were good.
Persistence is much lauded when it comes to building and growing a business, of course. But persisting in one’s approaches to problems and situations despite what the evidence tells us is not the way to win.
“Eventually, all playbooks will stop working,” Brian Balfour says. “The real problem is that over time we become habitualized and attached to these playbooks, and we resist throwing them away.” No growth tactic, no acquisition channel, no retention technique will always work so well that you can cross that area of improvement off your to-do list. Things change, situations shift, what looks valid in one moment doesn’t apply to the next, and sometimes you’re just plain wrong.
As Kahneman concluded, “True intuitive expertise is learned from prolonged experience with good feedback on mistakes.” Continually ask smart questions, get and learn from feedback and metrics, modify approaches based on results, experiment, and start over, even when you feel confident.
Survivorship bias is the over-privileging of evident successes — whatever “survived” and made it through a process. Survivorship bias narrows your field of vision, so that you’re making decisions based on incomplete information, discounting or ignoring key evidence from the failures and cases that don’t make it through.
A commonly cited example comes from World War II. Beset by an increasing number of planes being shot down, the Center for Naval Analyses set out to determine where on the bodies of planes extra armor should be added. Researchers diagrammed the bullet and shrapnel holes on every plane that made it back, mapping the average distribution of damage across the fleet.
Fortunately, thanks to statistician Abraham Wald, extra armor wasn’t added to the most heavily hit spots. Instead, Wald recommended that armor be added where fewer bullet holes were observed, because the planes hit there were the ones that never made it back to be analyzed. All the planes incurring damage to their wingtips and tails had been able to fly back to be diagrammed in the first place. The naive analysis failed to consider vital information and evidence.
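Wald’s insight can be made concrete with a small simulation. The numbers below are invented for illustration, not his actual data: every plane takes hits uniformly across four sections, but hits to the engine are far more likely to bring a plane down, so a tally over returning planes alone dramatically undercounts engine damage.

```python
import random

random.seed(0)

# Hypothetical hit sections and (invented) probabilities that a hit
# to each section brings the plane down.
SECTIONS = ["engine", "cockpit", "fuselage", "wingtips"]
DOWN_PROBABILITY = {"engine": 0.8, "cockpit": 0.6, "fuselage": 0.1, "wingtips": 0.05}

all_hits = {s: 0 for s in SECTIONS}       # ground truth: every hit taken
returned_hits = {s: 0 for s in SECTIONS}  # what the analysts can observe

for _ in range(10_000):
    hits = [random.choice(SECTIONS) for _ in range(3)]  # 3 hits per plane
    survived = all(random.random() > DOWN_PROBABILITY[s] for s in hits)
    for s in hits:
        all_hits[s] += 1
        if survived:
            returned_hits[s] += 1

# Counting only returning planes undercounts engine hits: planes hit
# there rarely made it back to be diagrammed.
for s in SECTIONS:
    print(f"{s:>9}: {all_hits[s]:5} total hits, {returned_hits[s]:5} seen on returners")
```

Hits are spread evenly in truth, yet the engine looks like the *least*-damaged section in the observable data, which is exactly the trap the original analysts fell into.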
Survivorship bias is a classic problem in business and investment. Looking only at prior successes in a particular vertical can make a certain bet seem good to a VC. Reading only about past successes can lead to incorrect conclusions about how first-time entrepreneurs should think about building their businesses. These are good high-level examples—in the trenches of making product, though, survivorship bias is even more prevalent.
Your existing users are survivors too. When you go to learn about how your product is working from them, you always have to weigh the fact that they made it through. As John Egan, Growth Engineer at Pinterest, puts it:
Every active user you have today has figured out how to use the product and is getting enough value to continue to use it. Everyone else that didn’t get it has probably already churned out.
John shares one example from his time at Shopkick, when his growth team thought an experiment to encourage users to visit partner stores had fallen flat, achieving just a 2-3% lift, well below expectations. But when they segmented the analysis by existing and new users, they discovered that store visits among new users were up 30%. The experiment became a specific strategy for effectively activating new users.
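A toy calculation (with invented numbers, not Shopkick’s actual data) shows how a blended metric can hide a segment-level effect when existing users dominate the sample:

```python
# Hypothetical segments: user count, control visit rate, treatment visit rate.
# Existing users barely move; new users see a +30% relative lift.
segments = {
    "existing": (9000, 0.40, 0.408),
    "new":      (1000, 0.10, 0.13),
}

def blended_rate(rate_index):
    """User-weighted average visit rate across all segments."""
    total_users = sum(users for users, _, _ in segments.values())
    return sum(users * rates[rate_index] for users, *rates in segments.values()) / total_users

control, treatment = blended_rate(0), blended_rate(1)
overall_lift = (treatment - control) / control
print(f"overall lift: {overall_lift:.1%}")  # small blended lift

for name, (users, c, t) in segments.items():
    print(f"{name}: {(t - c) / c:.1%} lift")
```

Because existing users make up 90% of the sample, the strong effect on new users is diluted to a few percent overall, which is why segmenting before judging an experiment matters.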
Take people who tried your product briefly into account too — especially when it comes to activation and onboarding. They might know better than anyone why something is or isn’t working.
Cognitive biases exist because the brain needs to be able to take mental shortcuts and reroute focus and attention in other ways. The problem is when we blindly let our biases lead us, because it’s the path of least resistance. Clearer thinking takes effort, requiring deliberate and intentional thought processes that question paths taken, respond to feedback, and take specific context into account.
From Buster Benson’s helpful resource organizing 175 cognitive biases