Chapter 1: Abraham Wald's Bombers
In the summer of 1943, at 401 West 118th Street in Manhattan, the Statistical Research Group was trying to win a war with pencils. Milton Friedman was there. Jacob Wolfowitz. And among them, quiet and precise, was Abraham Wald — a Hungarian Jewish mathematician who had fled Vienna after the Anschluss.1
The military had a problem. American bombers were returning from Europe riddled with bullet holes. Armor is heavy — you can't armor everything. So they mapped where the holes were: heavy damage on the fuselage, fuel system, and wings. Engines were relatively unscathed. The conclusion seemed obvious: reinforce the fuselage and wings.
They brought the data to Wald. He said: Armor the engines.
Wald's insight was devastatingly simple. The holes prove that a plane can take damage there and still fly. The engines don't have holes because planes hit in the engines don't come back. The data you have is shaped by the data you don't have — the planes that crashed in German territory, their crews dead, their stories untold.2
The military officers saw a pattern in the data and wanted to reinforce where the holes were. Wald saw the absence of a pattern and realized where the holes should be but weren't. This is the essence of mathematical thinking: not just asking what the data says, but asking what the data would look like if something important were missing.
Consider what it took for Wald to see this. The military had brilliant engineers, experienced pilots, and statisticians. They had analyzed thousands of planes. But they were looking at the wrong denominator. They were asking "where are the holes on the planes that returned?" when they should have been asking "what's different about the planes that didn't return?" The first question gives you an answer about survivable damage. The second question — the one that requires imagining data you'll never see — gives you the answer about fatal damage.
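Wald's reframing can be made concrete with a toy simulation. The numbers below are invented for illustration — this is not Wald's actual model or the military's data. Planes take hits spread uniformly across four sections, a hit to the engine is far more likely to down the plane, and we then tally holes only on the planes that made it home, just as the military did:

```python
import random

random.seed(0)

SECTIONS = ["fuselage", "wings", "fuel system", "engine"]
# Assumed probability that a single hit in this section downs the plane
# (illustrative values, not historical estimates).
LETHALITY = {"fuselage": 0.05, "wings": 0.05, "fuel system": 0.15, "engine": 0.70}

def fly_mission(hits=8):
    """Return (survived, hit_counts) for one sortie."""
    counts = {s: 0 for s in SECTIONS}
    survived = True
    for _ in range(hits):
        section = random.choice(SECTIONS)  # hits land uniformly at random
        counts[section] += 1
        if random.random() < LETHALITY[section]:
            survived = False  # plane is lost; its holes are never observed
    return survived, counts

observed = {s: 0 for s in SECTIONS}  # holes on planes that returned
actual = {s: 0 for s in SECTIONS}    # holes on ALL planes, lost or not
for _ in range(20_000):
    survived, counts = fly_mission()
    for s in SECTIONS:
        actual[s] += counts[s]
        if survived:
            observed[s] += counts[s]

total_actual = sum(actual.values())
total_observed = sum(observed.values())
for s in SECTIONS:
    print(f"{s:12s} actual share {actual[s] / total_actual:.2f}  "
          f"observed share {observed[s] / total_observed:.2f}")
```

The engine takes its fair share of hits overall (about a quarter), but on the returning planes it looks nearly untouched — exactly the pattern Wald explained. The returned-plane data answers "where can a plane survive a hit?", not "where do hits kill?"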
The Bias You Can't See
Survivorship bias is the logical error that comes from concentrating on the things that made it past some selection process, while overlooking those that didn't. It hides by definition: you can't observe what's been removed from the sample.
And it doesn't just give you wrong answers. It gives you wrong answers that feel incredibly right. The data looks clean. The patterns look strong. The narrative practically writes itself. You might think you're doing rigorous analysis, but actually you're reading a story with half the characters deleted. That's what makes survivorship bias so dangerous — it masquerades as insight.
The bias operates like a filter that we forget is there. Imagine trying to understand "what makes a successful book" by only reading bestsellers. You'd conclude that books need to be exactly 287 pages, feature a protagonist with a troubled childhood, and include at least one scene in a coffee shop. The thousands of unpublished manuscripts with these same features — the ones that got rejected by every agent in New York — have vanished from your sample. You're not studying what makes books succeed; you're studying what successful books have in common, which is a different question entirely.
This distinction matters because our brains are pattern-matching machines. We see a cluster of successful people with similar traits and our minds immediately construct a causal story. Steve Jobs wore black turtlenecks. Successful startup founders drop out of college. Ancient Greek buildings used marble columns. Our brains scream: There must be a connection! And there might be — or it might be that black turtlenecks, college dropouts, and marble columns are common among failures too, but we've systematically eliminated those from our view.
The truly insidious part is that survivorship bias creates self-reinforcing cycles. Business schools study successful companies and teach their methods. Aspiring entrepreneurs copy those methods. Some succeed, most fail — but the ones who succeed get studied, their methods get taught, and the cycle continues. Meanwhile, the graveyard of failed companies using identical methods grows quietly, unobserved, its lessons unlearned.
The Mutual Fund Mirage
Every year, mutual funds that underperform get quietly shut down or merged. Their track records vanish. A 2011 Vanguard study found that of 1,540 U.S. equity funds available in 1997, only 55% still existed by 2011. Nearly half had been killed off, and the dead funds' dismal returns went with them.34
This creates a statistical illusion that would make Houdini jealous. Imagine you start with a thousand fund managers who have no skill whatsoever — they're just flipping coins, betting on stocks randomly. After one year, about 500 will be above average. After two years, about 250. After five years, roughly 30 will have beaten the market every single year purely by chance. These 30 survivors will be hailed as geniuses. They'll be profiled in magazines. They'll write books about their investment philosophy. And they'll be managing billions of dollars based on coin flips.
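The coin-flip arithmetic above is easy to check with a short sketch. There is no skill anywhere in this model: each "manager" beats the market in any given year with probability one half, independent of the past.

```python
import random

random.seed(1)

managers = 1000
years = 5

streaks = 0
for _ in range(managers):
    # A manager with zero skill beats the market each year with p = 1/2.
    if all(random.random() < 0.5 for _ in range(years)):
        streaks += 1  # five straight market-beating years, by luck alone

# Expectation: 1000 / 2**5 = 31.25 "geniuses" from pure coin flips.
print(f"{streaks} managers beat the market {years} years straight")
```

Run it a few times with different seeds and you will reliably get around thirty lucky streaks — each one indistinguishable, from the outside, from genuine skill.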
The financial industry has a term for this: "closet indexing." Many actively managed funds secretly track the index while charging high fees for the privilege of pretending to be special. But the deeper problem is survivorship bias itself — the dead funds don't just die, they vanish from historical databases as if they never existed. When you look at ten-year performance averages, you're not looking at all the funds that existed ten years ago; you're looking at the ones that survived, which by definition had better performance.
The Mutual Fund Graveyard
Start with 1,000 random funds (no skill, pure luck). Each year, eliminate the worst performers. Watch the survivors look like geniuses.
The survivors aren't skilled. They're lucky. But you'd never know it, because the unlucky ones have been erased from the record. The gap between the survivors' average return and the true average — the survivorship gap — can easily reach 1–2 percentage points annually. Over decades, that compounds into tens of thousands of dollars in illusory gains.
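The graveyard effect can be sketched in a few lines. All numbers here are invented for illustration (7% mean annual return, 15% volatility, bottom 5% of funds liquidated each year); the point is only that culling losers inflates the survivors' apparent track record even when no fund has any skill.

```python
import random

random.seed(2)

# 1,000 funds with identical return distributions: luck, not skill.
funds = [[] for _ in range(1000)]
alive = list(range(1000))
all_returns = []  # every return ever earned, including by dead funds

for year in range(10):
    results = []
    for i in alive:
        r = random.gauss(0.07, 0.15)  # illustrative: 7% mean, 15% volatility
        funds[i].append(r)
        all_returns.append(r)
        results.append((r, i))
    results.sort()  # liquidate the worst 5%; they vanish from the database
    alive = [i for _, i in results[len(results) // 20:]]

survivor_returns = [r for i in alive for r in funds[i]]
true_avg = sum(all_returns) / len(all_returns)
survivor_avg = sum(survivor_returns) / len(survivor_returns)
print(f"true average return: {true_avg:.1%}")
print(f"survivors' average:  {survivor_avg:.1%}")
print(f"survivorship gap:    {survivor_avg - true_avg:.1%}")
```

Even this mild culling rule produces a gap on the order of one to two percentage points a year — which is what you are unknowingly looking at whenever a database shows you only the funds that still exist.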
This is why the financial advice "find a fund that has beaten the market for ten years" is so dangerous. You're not finding skill; you're finding survivors. The funds that beat the market for ten years are precisely the ones still around to be found. The ones that failed are buried in unmarked graves, their investors' money gone, their lessons unlearned.
The Architecture of What Lasted
Walk through any European city and you'll hear a familiar refrain: "They don't build them like they used to." The old buildings — the Gothic cathedrals, the Georgian townhouses, the Roman aqueducts — are magnificent. The new buildings are glass-and-steel boxes that leak when it rains. The conclusion seems inescapable: craftsmanship has declined. We've traded beauty for efficiency, soul for spreadsheets.
But here's the thing: they built plenty of garbage in 1723. Shoddy timber-frame houses with crooked walls and leaking roofs. Warehouses that collapsed under their own weight. Entire neighborhoods of cheap, cramped housing thrown up by speculative builders cutting every corner. You've never seen any of it, because it fell down. What's left standing after three centuries is, by definition, the stuff that was built well enough to last three centuries. You're not comparing the average old building to the average new building; you're comparing the top 2% of old buildings — the ones that survived fire, flood, war, neglect, and the wrecking ball — to all new buildings, including the ones that will be demolished next Tuesday.
The same logic infects every nostalgic comparison. "Music was better in the '60s." Was it? Or have fifty years of cultural filtering removed "Yummy Yummy Yummy (I Got Love in My Tummy)" from the canon while preserving Abbey Road? "Classic literature is superior to modern fiction." Sure — if you're comparing the tiny fraction of nineteenth-century novels that professors still assign against the entire undifferentiated output of contemporary publishing, including the romance novels about werewolf accountants.
Nassim Nicholas Taleb calls this the Lindy Effect — the observation that for non-perishable things, every additional day of survival implies a longer remaining life expectancy.5 A book that's been in print for a hundred years will probably be in print for another hundred. But the Lindy Effect is a consequence of survivorship bias, not a correction for it. It tells you that survivors tend to keep surviving; it doesn't tell you that the things that died deserved to die, or that the things that survived are inherently better than what was lost. The Library of Alexandria burned. We'll never know what we lost. The silence of the dead texts isn't evidence that they weren't worth reading.
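The Lindy Effect has a precise form when lifetimes follow a power law: for a Pareto distribution with shape parameter a > 1, the expected remaining life of something that has survived to age t is t/(a − 1) — it grows in proportion to age. A quick empirical check, with parameters chosen purely for illustration:

```python
import random

random.seed(3)

# Pareto-distributed lifetimes (shape a = 3, minimum 1), sampled via
# inverse-CDF: x = 1 / u^(1/a) for u uniform on (0, 1].
a = 3.0
lifetimes = [1.0 / (1.0 - random.random()) ** (1.0 / a) for _ in range(1_000_000)]

def expected_remaining(t):
    """Average additional lifetime among items that survived past age t."""
    remaining = [x - t for x in lifetimes if x > t]
    return sum(remaining) / len(remaining)

# Theory predicts t / (a - 1) = t / 2: survivors to age 8 have four times
# the expected remaining life of survivors to age 2.
for t in (2, 4, 8):
    print(f"survived to {t}: expect about {expected_remaining(t):.1f} more")
```

Note what the sketch does and does not show: among power-law survivors, age predicts longevity — but the model says nothing about whether the things that died young deserved to, which is exactly Taleb's caveat.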
Billionaire Advice and the Graveyard of Startups
In 2005, Steve Jobs gave a commencement address at Stanford. "Follow your heart," he told the graduates. "Stay hungry. Stay foolish." It's one of the most-watched speeches in history. It's also, from a statistical standpoint, terrible advice.
Not because Jobs was wrong about his own life — he followed his heart, dropped out of college, started Apple in a garage, got fired from his own company, came back, and built the most valuable corporation on Earth. The advice is a perfect description of what Steve Jobs did. The problem is that it's also a perfect description of what thousands of failed entrepreneurs did. They followed their hearts too. They stayed hungry. They stayed foolish. They dropped out, maxed out credit cards, slept in their cars, pivoted seventeen times, and ended up bankrupt, divorced, or both. You've never heard their commencement addresses because nobody invites failures to give commencement addresses.
The math here is brutal and simple. Let's say 10,000 people drop out of college to start companies. 9,999 fail and end up worse off than if they'd stayed in school. One becomes Bill Gates. Now someone writes a book about "What Bill Gates Did Right" and concludes that dropping out was a brilliant strategic move. But the Bureau of Labor Statistics consistently shows a gap of over $27,000 per year between those with only some college and those with a bachelor's degree.6 If dropping out were truly a winning strategy, we'd see it work consistently, not as a lottery with one winner and ten thousand losers.
This is the template for almost all success advice. We look at the survivors, identify their traits, and present those traits as a recipe — never checking whether the failures had the same traits. We study Gates's coding habits, his reading list, his management style, as if these were the causes of his success. But thousands of people have identical habits and never build Microsoft. The habits might be necessary conditions, but they're certainly not sufficient. And without studying the failures, we can't even say they're necessary.
Peter Thiel tells aspiring entrepreneurs that competition is for losers, that you should seek monopoly. Fine advice if you're Peter Thiel. But the cemetery of startups that tried to create monopolies and failed — the ones that burned through venture capital chasing a market that didn't exist, or that got crushed by an incumbent they thought they could displace — that cemetery is vast and silent. Nobody writes case studies about the company that tried to be the next PayPal and ran out of money in eighteen months.
Jim Collins's Good to Great studied companies that leaped from mediocre to sustained greatness — Circuit City, Fannie Mae, Wells Fargo. Those examples haven't aged well. But the deeper problem is methodology: Collins studied only companies that succeeded and asked "what did they do?" without systematically studying companies that did the same things and failed.7 He identified characteristics like "Level 5 Leadership" and "confronting the brutal facts." But did failed companies lack these traits? We don't know, because Collins never looked. He constructed a theory of success from a sample of successes, which is like constructing a theory of flight from a sample of planes that didn't crash.
Studying only the survivors makes their characteristics look causal when they may just be common.
The honest version of most billionaire advice would sound something like this: "I did these things, and I also got extremely lucky in ways I may not fully understand or be willing to acknowledge. Many people did similar things and failed. I can't actually tell you which of my decisions mattered and which were irrelevant, because I have a sample size of one — my own life — and no control group." That speech, while accurate, would not go viral on LinkedIn.
Ellenberg would recognize this as a problem of nonlinear thinking — the kind he spends his whole book trying to teach us. We want the relationship between effort and outcome to be linear: work hard, succeed proportionally. But entrepreneurship is wildly nonlinear. Tiny differences in timing, connections, market conditions, and sheer luck create enormous differences in outcomes. When you study only the winners, you flatten this complexity into a simple narrative. You turn a stochastic process into a fairy tale.
Silent Evidence
Nassim Taleb, in The Black Swan, gives survivorship bias its most evocative name: the problem of silent evidence.8 He tells the story of Diagoras of Melos, a Greek poet who lived in the fifth century BCE. Someone showed Diagoras a set of painted tablets depicting people who had prayed to the gods and survived a shipwreck. "See," they said, "proof that the gods answer prayers." Diagoras replied: "And where are the paintings of those who prayed and then drowned?"
Twenty-five centuries later, we're still making the same mistake. We see the tablets of the survivors and forget to ask about the drowned. The silent evidence problem is everywhere, once you learn to look for it. Medical research suffers from publication bias — studies that find a positive result get published; studies that find nothing get filed away. The published literature therefore overstates the effectiveness of treatments. Journalism suffers from it too: we report on the plane that crashed, not the 45,000 flights that landed safely, which makes flying seem more dangerous than it is. We report on the startup that became a unicorn, not the 999 that quietly dissolved, which makes entrepreneurship seem more promising than it is.
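Publication bias is easy to demonstrate with a toy model (all numbers invented): a treatment whose true effect is exactly zero, a thousand trials measuring it with noise, and a journal that publishes only the statistically significant results.

```python
import random

random.seed(4)

n_trials = 1000
n_patients = 100
true_effect = 0.0  # the treatment does nothing at all

published = []
for _ in range(n_trials):
    # Each trial's estimate is the mean of noisy per-patient measurements.
    estimate = sum(random.gauss(true_effect, 1.0)
                   for _ in range(n_patients)) / n_patients
    # Publish only "significant" results: |estimate| beyond ~1.96 standard
    # errors, i.e. roughly p < 0.05 under the null.
    if abs(estimate) > 1.96 / n_patients ** 0.5:
        published.append(estimate)

avg_published = sum(abs(e) for e in published) / len(published)
print(f"{len(published)} of {n_trials} trials published")
print(f"true effect: {true_effect}, average published |effect|: {avg_published:.2f}")
```

Roughly five percent of the null trials clear the significance bar by chance, and the published literature reports a healthy average effect for a treatment that does nothing. The drawer full of null results is the medical version of the drowned sailors' missing tablets.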
The deeper lesson — the one Wald understood in 1943, the one Taleb articulated sixty years later, the one Ellenberg would want us to carry in our mathematical toolkit — is this: before you draw conclusions from data, ask what's missing from the data and why. The most important information is often the information you don't have. The bullet holes that aren't there. The funds that no longer exist. The buildings that collapsed. The entrepreneurs who failed. The prayers that went unanswered.
Wald didn't have access to the missing planes. He couldn't examine them, measure their damage, or interview their crews. What he could do — what made him a genius — was reason rigorously about the absence. He could build a mathematical model that accounted for what he couldn't see. That's the skill this chapter asks you to develop: not just analyzing data, but analyzing the shape of what's missing from the data. It's harder than it sounds, because our minds are wired to work with what's in front of us. Seeing what isn't there requires a deliberate act of imagination.
So the next time someone tells you to study success — to read the habits of billionaires, to copy the strategies of market-beating funds, to emulate the buildings that lasted — pause. Ask the Wald question. Ask the Diagoras question. Ask: where are the ones that didn't make it, and what would they tell me if they could? The answer is usually: something completely different from what the survivors are saying.
Spot the Survivorship Bias
Can you identify which claims are distorted by survivorship bias?
"Eight out of ten successful restaurateurs say location is the key factor in their success."
"Exposed brick buildings from the 1800s are more structurally sound than modern drywall construction."
"A randomized controlled trial with 10,000 participants found that Drug X reduces blood pressure by 8%."
"College dropouts like Gates, Zuckerberg, and Jobs prove you don't need a degree to succeed."
Abraham Wald died in 1950, in a plane crash over India. He was forty-eight. There's a grim irony in that — the man who understood better than anyone how planes fail, killed by a failing plane. But his insight endures, not as a historical curiosity, but as a tool for thinking. Every time you look at data and draw a conclusion, ask yourself: am I looking at the bombers that came back?