The Rule Behind the Rule
In 1960, the British psychologist Peter Wason invented what might be the cruelest experiment in the history of cognitive science. Not because it hurt anyone physically — just their pride.
Here's what he did. He told his subjects: "I have a rule in mind that classifies sequences of three numbers. The sequence 2, 4, 6 satisfies this rule. Your job is to discover the rule by proposing other sequences of three numbers. For each one, I'll tell you whether it satisfies the rule or not. When you think you know the rule, tell me."1
Stop for a moment and think about what you'd do. You see 2, 4, 6, and your brain — beautiful, pattern-hungry organ that it is — immediately forms a hypothesis. Even numbers going up by two. So you test it: 8, 10, 12? Yes. 20, 22, 24? Yes. 100, 102, 104? Yes, yes, yes.
At this point, most of Wason's subjects announced their rule with supreme confidence. "Even numbers, ascending by two."
Wrong.
The actual rule was: any three numbers in ascending order. That's it. 1, 2, 3 works. So does 5, 99, 8000. So does π, 47, a million. The subjects' rule was a tiny subset of the real one, like mistaking your backyard for the planet Earth.
But here's the devastating part: they never found out, because they never tried a sequence that would disprove their hypothesis. Nobody tested 1, 3, 5. Nobody tested 9, 2, 7 (which would have been rejected — it's not ascending — and would have told them something genuinely useful). They only tested sequences that their theory predicted would work. And every "yes" reinforced their confidence in a theory that was, at best, incomplete and, at worst, completely wrong.2
Try it yourself:
🧪 Wason's 2-4-6 Task
The sequence 2, 4, 6 follows my rule. Propose three numbers to test. When you think you know the rule, guess it.
Think you know the rule?
If you're like most people, you tested a bunch of sequences that confirmed your initial hunch — and every "yes" made you more sure. That's confirmation bias in its purest form: the tendency to search for, interpret, and remember information that confirms what you already believe.3
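To see concretely why the confirmation strategy is so unhelpful, here's a minimal Python sketch (an illustration of the logic, not anything from Wason's study): the experimenter's real rule, the typical subject's hypothesis, and the two kinds of tests you could run against them.

```python
# A toy version of the 2-4-6 task (illustrative only).

def true_rule(a, b, c):
    """The experimenter's actual rule: any three numbers in ascending order."""
    return a < b < c

def my_hypothesis(a, b, c):
    """The typical subject's guess: even numbers ascending by two."""
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Confirmation strategy: only propose triples my hypothesis already predicts.
# Both rules say "yes" to every one, so the answers can't tell them apart.
for triple in [(8, 10, 12), (20, 22, 24), (100, 102, 104)]:
    print(triple, "experimenter says:", true_rule(*triple),
          "| my hypothesis predicts:", my_hypothesis(*triple))

# Disconfirmation strategy: propose a triple my hypothesis forbids.
# The experimenter's "yes" to (1, 3, 5) refutes the narrow hypothesis outright.
triple = (1, 3, 5)
print(triple, "experimenter says:", true_rule(*triple),
      "| my hypothesis predicts:", my_hypothesis(*triple))
```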
The Asymmetry of Evidence
There's a logical reason why Wason's task is so hard, and it goes back to one of the deepest ideas in the philosophy of science. Karl Popper, the Austrian-British philosopher, put it this way: you can never prove a universal theory, but you can disprove one in a single stroke.4
A thousand white swans cannot prove that all swans are white. But one black swan can prove that they are not.
This is the asymmetry of evidence, and it's not just a philosopher's curiosity — it's a mathematical fact. Consider two hypotheses about our swan population: H₁, that all swans are white, and H₂, that most swans are white but some are not.
Seeing a white swan is consistent with both "all swans are white" and "most swans are white." It barely moves the needle between them.
When you see a white swan, that evidence is consistent with both hypotheses. Under H₁ (all swans are white), you'd definitely see a white swan. Under H₂ (most swans are white, some aren't), you'd also very likely see a white swan. So a white swan doesn't help you distinguish between them. Formally: P(white swan | H₁) = 1 and P(white swan | H₂) is nearly as large, so the likelihood ratio between the two hypotheses is close to 1 and your beliefs barely move.
But a black swan? That's a different story entirely. Under H₁, the probability of seeing a black swan is exactly zero. Under H₂, it's small but possible. One black swan annihilates H₁. That's the Popperian insight: disconfirming evidence is disproportionately powerful.
And yet — and this is the heartbreaking part — our brains are wired to seek the white swans. We Google our symptoms looking for the diagnosis we've already made. We read the news sources that confirm our political priors. We ask questions that can only tell us what we already think we know.
Bayes Meets Bias
Let's make this precise with Bayes' theorem. Suppose you think there's a 60% chance that all swans are white (H₁) and a 40% chance that only 90% of swans are white (H₂). Now you see a white swan. How should your beliefs update?
Under H₁, a white swan has probability 1.0. Under H₂, it has probability 0.9. Running the numbers: P(H₁|white) = (1.0 × 0.6) / (1.0 × 0.6 + 0.9 × 0.4) = 0.6 / 0.96 ≈ 0.625. Your confidence in H₁ nudges from 60% to 62.5%. A whisper of evidence.
Now suppose you see a black swan. Under H₁, the probability is 0. Under H₂, it's 0.1. The posterior: P(H₁|black) = (0 × 0.6) / (0 × 0.6 + 0.1 × 0.4) = 0. Done. H₁ is dead. One observation. No appeal.
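If you want to check the arithmetic yourself, here's a tiny Python sketch of the two-hypothesis update (the function name and structure are just illustrative choices):

```python
# Two-hypothesis Bayes update for the swan example.

def posterior_h1(prior_h1, lik_h1, lik_h2):
    """P(H1 | evidence) given the prior on H1 and the two likelihoods."""
    prior_h2 = 1.0 - prior_h1
    return (lik_h1 * prior_h1) / (lik_h1 * prior_h1 + lik_h2 * prior_h2)

# A white swan: likelihood 1.0 under H1 (all white), 0.9 under H2 (90% white).
print(posterior_h1(0.6, 1.0, 0.9))   # 0.625, a gentle nudge up from 0.6

# A black swan: likelihood 0.0 under H1, 0.1 under H2.
print(posterior_h1(0.6, 0.0, 0.1))   # 0.0, H1 is eliminated outright
```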
Watch this asymmetry in action:
📊 Evidence Asymmetry Visualizer
Hypothesis: All swans are white. Prior belief: 60%. Each confirming observation (white swan) nudges it up. One disconfirming observation (black swan) destroys it.
That's the mathematics behind Popper's intuition. Confirming evidence is cheap. Disconfirming evidence is priceless. And our built-in cognitive machinery systematically avoids the priceless kind.
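A rough simulation of what the visualizer above walks through, under the same assumed likelihoods (1.0 versus 0.9 for a white swan, 0.0 versus 0.1 for a black one): a long run of confirmations crawls toward certainty, and a single disconfirmation erases it.

```python
# Sequential updating: one hundred white swans, then one black swan.

def update(prior_h1, lik_h1, lik_h2):
    """Posterior on H1 after one observation, for two hypotheses."""
    return (lik_h1 * prior_h1) / (lik_h1 * prior_h1 + lik_h2 * (1.0 - prior_h1))

belief = 0.6
for i in range(1, 101):
    belief = update(belief, 1.0, 0.9)          # another white swan
    if i in (1, 10, 50, 100):
        print(f"after {i:3d} white swans: P(H1) = {belief:.4f}")

belief = update(belief, 0.0, 0.1)              # a single black swan
print(f"after one black swan:   P(H1) = {belief:.4f}")
```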
Scientists Are People Too
You might think science is the antidote. After all, Popper's whole framework — falsificationism — says that science advances not by proving theories right, but by trying to prove them wrong. A theory that can't be falsified isn't science at all.
That's the ideal. The reality, as Thomas Kuhn documented in The Structure of Scientific Revolutions, is considerably messier.5 Scientists are humans who've spent years developing theories, building careers around them, sometimes staking their identities on them. When anomalous data appears — data that doesn't fit the paradigm — the first response is rarely "let me abandon my life's work." It's more like "there must be a measurement error."
When Barry Marshall proposed in 1982 that stomach ulcers were caused by the bacterium H. pylori rather than stress and diet, the medical establishment's response was essentially: "That's insane." Not because the evidence was weak — Marshall had cultured the bacteria, shown the correlation, even infected himself and developed gastritis — but because "everyone knew" ulcers were caused by lifestyle factors. It took over a decade and a Nobel Prize for the paradigm to shift.6
This is paradigm resistance, the pattern Kuhn documented: scientific communities protect established theories by explaining away anomalies rather than confronting them. It's confirmation bias wearing a lab coat.
The self-reinforcing cycle: belief drives the search for evidence, which reinforces the belief, which drives more selective search.
The Likelihood Trap
Here's a subtlety that even smart people miss. People routinely confuse two very different probabilities: P(evidence | hypothesis), the chance of seeing this evidence if the hypothesis is true, and P(hypothesis | evidence), the chance the hypothesis is true given that you've seen the evidence.
This is the prosecutor's fallacy, and it's confirmation bias wearing a suit. A prosecutor might argue: "The probability that an innocent person's DNA matches the crime scene is one in a million. The defendant's DNA matches. Therefore, the probability of innocence is one in a million." But that ignores the base rate — in a city of 10 million people, roughly 10 people share that DNA profile. The match is strong evidence, but not overwhelming.7
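Here's a back-of-the-envelope sketch of the base-rate correction using the paragraph's numbers. The simplifying assumptions are mine, not the text's: exactly one guilty person in the city, the true perpetrator always matches, and there is no other evidence against the defendant.

```python
# The prosecutor's fallacy, with the "one in a million" match probability
# in a city of 10 million people.

population = 10_000_000
p_match_if_innocent = 1e-6                     # P(DNA match | innocent person)

expected_innocent_matches = (population - 1) * p_match_if_innocent   # ~10 people
p_guilty_given_match = 1.0 / (1.0 + expected_innocent_matches)

print(f"innocent people expected to match: {expected_innocent_matches:.0f}")
print(f"P(guilty | match)   = {p_guilty_given_match:.3f}")     # ~0.09
print(f"P(innocent | match) = {1 - p_guilty_given_match:.3f}")  # ~0.91, not 1e-6
```

On these assumptions, a defendant identified by nothing but the match is far more likely innocent than guilty; the match shrinks the pool from millions to about eleven people, which is strong evidence but nowhere near proof.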
The same confusion poisons everyday reasoning. Your horoscope says "you will face a challenge today." You face a challenge. (Who doesn't?) P(challenge | horoscope is real) is high — but P(challenge | horoscope is nonsense) is also high. The evidence is common under both hypotheses, so it tells you nothing. But it feels like confirmation, and that feeling is what matters to the brain.
Where the Damage Is Done
If confirmation bias were just a quirk that made parlor games tricky, we could live with it. But the damage extends into every domain where humans make consequential judgments.
Criminal Investigation
Tunnel vision is what happens when detectives fix on a suspect early and then — consciously or not — seek only evidence pointing to that suspect. They remember the eyewitness who said "it was a tall man" and forget the one who said "I think it was a woman." The Innocence Project has documented hundreds of wrongful convictions driven in part by this dynamic: once the machinery of prosecution locks onto a target, disconfirming evidence becomes invisible.8
Medical Diagnosis
Doctors call it anchoring. A patient presents with chest pain, and the first doctor to see them writes "possible cardiac event" in the chart. Every subsequent doctor reads that note and filters the symptoms through that lens. The abdominal tenderness that might have suggested pancreatitis gets downweighted. The diagnosis becomes a self-fulfilling prophecy, not because the evidence supports it, but because the hypothesis came first.
Political Beliefs
The machinery of selective exposure has never been more powerful. Algorithms learn what you click on and serve you more of the same. If you believe immigration is dangerous, you'll see stories about immigrant crime. If you believe it's enriching, you'll see success stories. Both are real — but neither represents the full picture. The result is that people with opposing views can look at the same world and see completely different realities, each convinced the evidence is on their side.
Investing
Here's a reliable way to lose money: buy a stock, then go looking for reasons it will go up. You'll find them. There are always bull cases and bear cases for every position. The confirmation-biased investor reads the bull case for stocks they own and the bear case for stocks they don't. Their portfolio becomes a monument to motivated reasoning.
The Remedies
If the disease is a refusal to seek disconfirmation, the cure is building structures that force it. The last century of scientific methodology is, in a sense, a long war against our own cognitive tendencies.
Four Debiasing Strategies
Pre-registration: Before running an experiment, scientists publish their hypothesis and analysis plan. This prevents the all-too-human practice of running the experiment, looking at the data, and then "hypothesizing" whatever the data shows. Pre-registration doesn't eliminate bias, but it makes it visible.
Adversarial collaboration: Two scientists who disagree design an experiment together, agreeing in advance what result would change each person's mind. The psychologist Daniel Kahneman championed this approach precisely because he understood that smart people are the most creative at finding reasons to dismiss inconvenient evidence.
Red teams: Organizations appoint a group whose explicit job is to argue against the prevailing view. Intelligence agencies use them. So do some investment firms. The key is that the red team must be genuinely empowered — not a token exercise in pseudo-disagreement.
"Consider the opposite": The simplest debiasing technique is also one of the most effective. Before committing to a judgment, ask yourself: "What would have to be true for the opposite conclusion to hold?" Charles Darwin practiced this: he kept a special notebook for facts that contradicted his theories, because he noticed he was prone to forgetting them.
One disconfirming observation outweighs a hundred confirming ones. The scale of evidence is radically asymmetric.
The Hard Part
The hardest thing about confirmation bias isn't understanding it. You understand it now. The hardest thing is that knowing about it barely helps. Wason's subjects were university students — smart, educated people. They still fell for it. Knowing the name of the trap doesn't spring you from it, any more than knowing the word "gravity" lets you fly.
What does help is changing the structures around your thinking. Don't rely on willpower to "be objective." Instead, build systems: write down your prediction before you see the data. Seek out the smartest person who disagrees with you. Ask not "is there evidence for my view?" but "what evidence would make me change my mind?"
And when you find yourself Googling "evidence that [thing I already believe]" — stop. Google the opposite instead. You'll be amazed at what you find. Or horrified. Either way, you'll be better calibrated, which in a world drowning in information and starving for wisdom, is about the most valuable thing you can be.
The mathematician in me wants to end with a formula. The humanist in me wants to end with an aphorism. So here's both: P(you're wrong | you've never looked for evidence you're wrong) is uncomfortably high. And as the old saying goes, it ain't what you don't know that gets you into trouble — it's what you know for sure that just ain't so.