
The Missing Chapter

The Availability Heuristic

Why we fear sharks more than vending machines — and what that costs us

An extension of Jordan Ellenberg's "How Not to Be Wrong"

Chapter 76

The Shark and the Vending Machine

About five people per year are killed by sharks in the United States. About thirteen are killed by vending machines. Now: which one keeps you up at night?

If you're like most humans — and I mean this as a compliment, mostly — you spend vastly more mental energy worrying about shark attacks than about the rogue Snapple dispenser in your office break room. This is not because you're stupid. It's because you have a brain, and brains come with factory-installed software that was last updated during the Pleistocene.

The bug in the software has a name. In 1973, two psychologists named Amos Tversky and Daniel Kahneman identified what they called the availability heuristic: when we need to estimate how frequent or probable something is, we don't do the sensible thing and look up the base rates. We ask ourselves a different question entirely — how easily can I think of an example?1

And this, it turns out, is like judging the quality of a restaurant by how bright its neon sign is. Sometimes the correlation holds. Often it doesn't. And when it fails, it fails in ways that distort our politics, our medicine, our personal decisions, and our understanding of what the world is actually like.

· · ·

The Letter K Problem

Here's the experiment that started it all. Tversky and Kahneman asked people a simple question: Are there more English words that start with the letter K, or more words that have K as the third letter?

Most people said "start with K." It's not even close, they figured — king, kite, kitchen, kangaroo. Easy. The words just flow.

But the answer is the opposite. Words with K as the third letter — ask, ink, acknowledge, lake, make, take, bike — outnumber K-starters by roughly three to one.2 The only reason people get this wrong is that our mental filing system is organized like a dictionary: by first letter. Retrieving words that start with K is trivially easy. Retrieving words with K in the third position requires a kind of cross-referencing that our brains are simply not built to do quickly.
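The comparison is easy to run yourself against any word list. Here is a minimal Python sketch; the sample list just reuses the words named in this chapter, so it only illustrates the counting method, while a real check would feed in a full dictionary:

```python
def k_position_counts(words):
    """Count words starting with 'k' vs. words with 'k' as the third letter."""
    starts = sum(1 for w in words if w.lower().startswith("k"))
    third = sum(1 for w in words if len(w) >= 3 and w[2].lower() == "k")
    return starts, third

# The words named in this chapter (a full dictionary shows the larger gap):
sample = ["king", "kite", "kitchen", "kangaroo",
          "ask", "ink", "acknowledge", "lake", "make", "take", "bike"]
print(k_position_counts(sample))  # (4, 7): even here, third-letter k wins
```

Notice that writing the third-letter condition takes an index lookup while the first-letter condition is a built-in operation; your mental dictionary has the same asymmetry.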

So when Tversky and Kahneman asked "which is more common?" people didn't answer that question. They answered a substitute question: "which is easier to recall?" And they didn't even notice they'd made the swap.

The Availability Substitution

Question asked: "How frequent is X?"
Question answered: "How easily can I recall X?"

These diverge when:
• Media coverage ≠ actual frequency
• Vivid or emotional events dominate memory
• Personal experience is unrepresentative

Your brain silently replaces a hard statistical question with an easy memory-retrieval question.

This is the core mechanism of the availability heuristic. It's a substitution: replace a hard question (what's the actual probability?) with an easy one (how quickly does an example come to mind?). The substitution is invisible. You don't feel yourself doing it. You just feel confident.

· · ·

Why Your Brain Works This Way (And Why It Used to Be Fine)

Before you get too angry at your brain, consider: the availability heuristic isn't a bug. It's a feature that shipped in version 1.0 of Homo sapiens and worked beautifully for a hundred thousand years.

In ancestral environments, the correlation between "easy to remember" and "actually common" was very strong. If you could vividly recall a lion attack at the watering hole, that was excellent evidence that lions frequent the watering hole. Your sample was your actual environment. The availability heuristic was a brilliant, energy-efficient shortcut — why waste cognitive resources counting when you could just check how fluently examples came to mind?3

The problem is that we no longer live in a world where our personal experience is a representative sample of reality. We live in a world with mass media.

A plane crash that kills 200 people generates weeks of front-page coverage. Car accidents that kill 200 people generate — well, nothing, because that happens every two days in America, and "thing that happens every two days" is not a story. The result: people dramatically overestimate the risk of flying and underestimate the risk of driving. Some people, frightened by coverage of a plane crash, choose to drive instead of fly, which actually increases their risk of dying.4

The media doesn't just report reality — it curates reality for maximum salience. Rare, vivid, dramatic events make good stories. Common, slow-moving, statistical killers do not. And so our availability-based mental model of the world — the model that used to map pretty well onto actual risks — becomes a funhouse mirror.

Test Your Own Risk Perception

How well calibrated is your sense of what kills people? Let's find out.

🎯 Risk Perception Quiz

For each pair, guess which cause of death kills more Americans per year.


· · ·

The Arithmetic of Dread

If the availability heuristic only made us nervous at the beach, it would be a curiosity — something for psychologists to write papers about and the rest of us to chuckle at during cocktail parties. But it reaches much further than that.

Consider how the United States allocates resources for public safety. In the years after September 11, 2001, the federal government spent roughly $150 billion per year on counterterrorism.5 During the same period, terrorism killed an average of fewer than 20 Americans per year on U.S. soil. Heart disease, meanwhile, killed about 650,000 Americans per year and received about $2 billion in federal research funding.

Let me put that differently. Per death prevented (or potentially prevented), counterterrorism spending outweighed heart disease spending by a factor of roughly 2.4 million to one: $150 billion divided by 20 deaths is about $7.5 billion per death, while $2 billion divided by 650,000 deaths is about $3,000 per death.
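The ratio follows directly from the figures already quoted; a few lines of Python make the arithmetic explicit (the dollar amounts and death counts below are the chapter's numbers, not fresh data):

```python
# Annual U.S. figures as quoted in this chapter:
ct_spending, ct_deaths = 150e9, 20        # counterterrorism: $150B, <20 deaths
hd_spending, hd_deaths = 2e9, 650_000     # heart disease research: $2B, 650,000 deaths

per_death_ct = ct_spending / ct_deaths    # ~$7.5 billion per death
per_death_hd = hd_spending / hd_deaths    # ~$3,077 per death
ratio = per_death_ct / per_death_hd
print(f"{ratio:,.0f} to one")             # 2,437,500 to one
```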

Spending vs. Deaths: The Availability Gap

Cause           Annual U.S. deaths    Federal spending per death
Heart Disease   650,000               ~$3,000
Car Accidents   38,000                n/a
Terrorism       <20                   ~$7.5 billion

The thing that scares us most gets the most money, not the thing that kills us most: federal spending per death is inversely correlated with how "available" the threat feels.

Now, I'm not saying we shouldn't spend money on counterterrorism. Terrorism has political and social effects beyond the death count, and prevention is genuinely hard to account for. But the magnitude of the disparity tells a story that has little to do with rational cost-benefit analysis and a lot to do with what fills our television screens. Terrorism is available. Heart disease is not.

The Availability Principle

The more vivid, emotional, and recent an event is, the more "available" it becomes — and the more we overestimate its frequency. Meanwhile, slow, statistical, undramatic killers become invisible precisely because they're so common.

Doctors Are Not Immune

You might think that trained professionals would be better at this. They are not. Studies have shown that physicians are systematically biased toward diagnosing conditions they've seen recently. A doctor who just treated a rare case of lupus is measurably more likely to diagnose the next ambiguous case as lupus too — not because the evidence points there, but because lupus is, at that moment, highly available.6

This is the thing about the availability heuristic that makes it so insidious: it doesn't feel like a bias. It feels like expertise. The doctor isn't thinking "I'm overfitting to my recent experience." The doctor is thinking "My clinical judgment tells me this is lupus." The availability heuristic doesn't announce itself. It wears the costume of intuition.

· · ·

Availability Cascades

In the 1990s, legal scholar Cass Sunstein and economist Timur Kuran identified something even more alarming than individual availability bias: they called it an availability cascade.7

Here's how it works. Some event occurs — a chemical spill, a school shooting, a new disease. Media covers it. Because the coverage makes the event available, people become concerned. Because people are concerned, more media covers it. Because more media covers it, the event becomes even more available. Public concern rises further. Politicians respond. More coverage follows.

An availability cascade is a self-reinforcing feedback loop in which media coverage creates fear, and fear creates more media coverage, until public policy has completely decoupled from actual risk.
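The loop can be caricatured in a few lines of code. This is a toy model of my own devising, not anything from Kuran and Sunstein: coverage drives concern, concern drives coverage, and the underlying risk never enters the loop at all.

```python
def cascade(steps=10, gain=1.5, coverage=1.0):
    """Toy availability cascade: each round, public concern tracks media
    coverage and coverage tracks concern. Actual risk appears nowhere."""
    history = [coverage]
    for _ in range(steps):
        concern = gain * coverage    # coverage makes the event feel frequent
        coverage = gain * concern    # concern makes the event newsworthy
        history.append(coverage)
    return history
```

With a feedback gain above one the numbers explode exponentially, which is the Alar pattern; with a gain below one the panic fades from the news cycle. Either way, the one thing the trajectory never depends on is the hazard itself.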

The Alar apple scare of 1989 is a textbook case. Alar was a chemical sprayed on apples to regulate growth. A report suggested it might be carcinogenic. Media coverage exploded. Parents panicked. Schools banned apples. The EPA acted. But the actual risk? A panel of scientists later estimated that a child would need to drink about 19,000 quarts of apple juice per day to reach the danger threshold. The risk was vanishingly small — but the cascade was self-sustaining.

Availability cascades are the availability heuristic scaled up from an individual cognitive quirk to a collective social phenomenon. They explain why certain risks dominate public discourse wildly out of proportion to their actual danger, while genuinely massive threats — antibiotic resistance, soil degradation, cardiovascular disease — languish in obscurity because they are boring.

See It For Yourself

The interactive below demonstrates how exposure shapes your frequency estimates — even when you know you're being manipulated.

🧠 Availability Bias Demonstrator

You'll see a rapid sequence of words from four categories. Then estimate how many of each you saw. Ready?

· · ·

The Outside View

So what do we do? If the availability heuristic is baked into our cognitive firmware, are we simply doomed to misjudge risk forever?

Not necessarily. Kahneman himself suggested the antidote, borrowing a term from forecasting: the outside view.8

The inside view says: "What does my gut tell me? What examples come to mind? How does this feel?" The outside view says: "What do the base rates say? What's the actual statistical frequency? What does the reference class look like?"

The inside view tells you flying is dangerous because you can picture the plane crash. The outside view tells you that in 2023, there were zero fatal commercial airline crashes in the United States, and your odds of dying on any given flight are roughly one in eleven million. The inside view tells you your neighborhood is getting more dangerous because you saw a crime story on the local news. The outside view tells you that violent crime in America has declined by roughly 50% since the early 1990s.
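Base rates also compose. Using the one-in-eleven-million per-flight figure just quoted (the flight count below is a made-up input for illustration), the outside view of a whole flying lifetime is a two-line calculation:

```python
p = 1 / 11_000_000                 # per-flight fatality odds quoted above

def risk_over(n_flights, p=p):
    """Probability of at least one fatal flight in n independent flights."""
    return 1 - (1 - p) ** n_flights

# An extreme hypothetical: 5,000 flights, roughly two a week for fifty years.
print(risk_over(5_000))            # ≈ 0.00045, about 1 in 2,200
```

Even that hypothetical road warrior faces less lifetime flying risk than most people's intuition assigns to a single turbulent descent.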

Adopting the outside view doesn't mean ignoring your instincts entirely. Instincts are sometimes useful data. But it means checking those instincts against the numbers — treating your vivid, emotionally charged first impression as a hypothesis rather than a conclusion.

Inside View vs. Outside View

Inside view: "Can I picture it?" "How scary does it feel?" "Did I see it on the news?" It maps salience to probability, and it is prone to bias.

Outside view: "What's the base rate?" "What do the data say?" "What's the reference class?" It maps frequency to probability, and it is calibrated.

The cure for availability bias: check your vivid intuitions against boring base rates.

Here's the real lesson of the availability heuristic, and it's a lesson that goes far beyond sharks and vending machines. The world is not as your brain presents it to you. Your mind constructs a model of reality, and that model is systematically skewed toward the vivid, the recent, the emotionally charged. The things you worry about most are often not the things most likely to hurt you. The things most likely to hurt you are the ones so common, so mundane, so boring that they never cross your mind at all.

The availability heuristic made us excellent survivors on the savannah. In the age of 24-hour news and algorithmic feeds, it makes us something closer to paranoid, misallocating our fear and our resources in ways that are both personally irrational and collectively dangerous.

The fix is not to stop feeling. The fix is to know when your feelings are doing your math for you — and to do the actual math instead.

After all, the vending machine doesn't care whether you find it scary.

Notes & References

  1. Tversky, A., & Kahneman, D. (1973). "Availability: A Heuristic for Judging Frequency and Probability." Cognitive Psychology, 5(2), 207–232.
  2. The original K-letter experiment appears in Tversky & Kahneman (1973). Subsequent replications have confirmed the effect across multiple languages and letters.
  3. Gigerenzer, G. (2007). Gut Feelings: The Intelligence of the Unconscious. Viking. Gigerenzer argues that heuristics like availability are "ecologically rational" — well-adapted to the environments in which they evolved.
  4. Gigerenzer, G. (2004). "Dread Risk, September 11, and Fatal Traffic Accidents." Psychological Science, 15(4), 286–287. Gigerenzer estimated that the increase in driving after 9/11 led to approximately 1,595 excess traffic fatalities in the 12 months following the attacks.
  5. Mueller, J., & Stewart, M.G. (2011). Terror, Security, and Money: Balancing the Risks, Benefits, and Costs of Homeland Security. Oxford University Press.
  6. Mamede, S., van Gog, T., van den Berge, K., et al. (2010). "Effect of Availability Bias and Reflective Reasoning on Diagnostic Accuracy Among Internal Medicine Residents." JAMA, 304(11), 1198–1203.
  7. Kuran, T., & Sunstein, C.R. (1999). "Availability Cascades and Risk Regulation." Stanford Law Review, 51(4), 683–768.
  8. Kahneman, D., & Lovallo, D. (1993). "Timid Choices and Bold Forecasts: A Cognitive Perspective on Risk Taking." Management Science, 39(1), 17–31.