
The Missing Chapter

The House Always Wins

The mathematics of inevitable ruin — and why the small player loses, given enough time.

An extension of Jordan Ellenberg's "How Not to Be Wrong"

Chapter 1

The Run and the Fall

In 1992, a Greek-American high-roller named Archie Karas drove into Las Vegas with exactly fifty dollars in his pocket. Over the next two and a half years, he turned that fifty into forty million. He beat the best poker players in the world. He shot dice at Binion's Horseshoe with a bankroll so obscene the casino had to keep raising his limits. For a brief, luminous window, Archie Karas was the most successful gambler who ever lived.1

Then he lost it all. Every cent. In three weeks.

The story of Karas has been told as a morality tale. But the morality tale misses the most important part: the mathematics didn't care about his skill or his willpower. The math said he was always going to go broke. Not probably. Not likely. With a probability that approaches certainty.2

This is the Gambler's Ruin problem, and it's one of the oldest in probability theory. Blaise Pascal and Pierre de Fermat — those famous correspondents who birthed modern probability in their 1654 letters about dice — were followed just three years later by a less famous but equally sharp mathematician named Christiaan Huygens. In 1657, Huygens published a little book called De Ratiociniis in Ludo Aleae ("On Reasoning in Games of Chance"), and tucked inside it was a question that sounds almost casual: Two players, A and B, flip a fair coin. A has 12 chips, B has 9. They bet one chip each flip. What's the probability that A will bankrupt B?

The answer: 12/21, or about 57%. Not 50%, as you might naively expect. B starts with fewer chips, so B has a higher probability of ruin. This asymmetry — that ruin probability depends not on skill or luck but on starting position — is the first hint of something profound. The universe doesn't just roll dice; it counts your chips first.
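Huygens's answer is easy to verify: in a fair game, a player's chance of taking everything is just that player's share of the chips. A minimal check in Python (the function name is mine):

```python
# Fair-game gambler's ruin: P(A takes all the chips) = a / (a + b),
# where a and b are the two players' starting chip counts.
def prob_a_ruins_b(a: int, b: int) -> float:
    """Probability that A eventually bankrupts B in a fair
    one-chip-per-flip coin game."""
    return a / (a + b)

# Huygens's example: A starts with 12 chips, B with 9.
print(f"{prob_a_ruins_b(12, 9):.4f}")  # 0.5714, i.e. 12/21
```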

[Figure: a lone figure walking a tightrope over a vast chasm, representing the gambler's precarious path. Every step forward is a coin flip — but the chasm only needs to win once.]

[Figure: gambler's ruin vs. casino ruin. Your $k is a few steps from disaster; the casino is an ocean away.]
Chapter 2

The Ruin Formula

Fair Game Ruin Probability
P(ruin) = 1 − k / N

You've got $20. The casino has $980. Your ruin probability is 1 − 20/1000 = 98%. In a fair game.3

Let's pause on that word: "fair." In probability theory, a fair game is one where the expected value of each bet is zero. Heads you win a dollar, tails you lose a dollar. No house edge, no loaded dice, no card counting or edge sorting. Just pure, crystalline fairness. And in this perfectly fair game, you will still go broke 98 times out of 100.

The variable k is your bankroll. The variable N is the total money in play — yours plus the casino's. If you have $k and the casino has $(N−k), your probability of eventually winning all the money is k/N, and your probability of ruin is 1 − k/N. Notice something elegant here: these probabilities sum to 1. There is no third option. In the long run, someone goes broke. The only question is who.

The formula has a curious property: it's linear. Double your share of the money on the table and you double your probability of winning all of it. To cut your ruin probability from 98% all the way down to 50%, you need to match the casino dollar for dollar. This explains why casinos have table limits, why they frown at card counters, why the pit boss watches you with eyes that have seen a thousand Archie Karases. They're not worried about losing to skill. They're worried about the rare whale who shows up with a bankroll that approaches their own. That person changes the math.

But here's what most people miss: the formula applies to the casino too. If you somehow walked into Binion's Horseshoe with a bankroll equal to the casino's entire reserve and played a fair game, you would have a 50% chance of breaking them. In that scenario, what matters is the relative sizes. The casino's "edge" is really just a way to ensure that over time, they accumulate enough of a buffer that no single player can threaten their ruin. The house doesn't always win because they're smarter. They always win because they're bigger.

Ruin Probability Calculator

[Interactive calculator: set your bankroll (default $500), the casino's bankroll (default $10,000), and the per-bet win probability (default 49%); it reports your ruin probability and the expected number of rounds.]
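The calculator above presumably evaluates the two classical quantities: the ruin probability for a possibly unfair game, and the expected duration of play. A sketch in Python, assuming $1 bets per round (the function names are mine; the unfair-game expressions are the standard ones):

```python
def ruin_probability(k: int, N: int, p: float) -> float:
    """Probability of going broke before reaching total wealth N,
    starting from bankroll k, winning each $1 bet with probability p."""
    if p == 0.5:
        return 1 - k / N
    r = (1 - p) / p                    # ratio of loss to win probability
    return 1 - (1 - r**k) / (1 - r**N)

def expected_rounds(k: int, N: int, p: float) -> float:
    """Expected number of bets until one side is ruined."""
    if p == 0.5:
        return k * (N - k)             # de Moivre's fair-game result
    q = 1 - p
    r = q / p
    return k / (q - p) - (N / (q - p)) * (1 - r**k) / (1 - r**N)

# The worked example from the text: $20 against a $980 casino, fair game.
print(ruin_probability(20, 1000, 0.5))  # 0.98
print(expected_rounds(20, 1000, 0.5))   # 19600
```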
Chapter 3

The Drunk Walk

The classic way to visualize gambler's ruin is as a random walk — a drunk staggering between two walls, the left wall being bankruptcy and the right wall being the casino's ruin. You might think this is symmetric. After all, a step left and a step right are the same size. But here's what your intuition misses: the distance to each wall is wildly different. You're three steps from disaster; the casino is three thousand. That asymmetry is everything.

Mathematicians call these "absorbing barriers" — points where the walk terminates. Once you hit zero, you don't get to keep staggering; the game is over. The drunk doesn't bounce back from the wall. And because you're closer to your absorbing barrier than the casino is to theirs, the mathematics says you'll hit it first, almost every time.

There's a beautiful result here that connects to something called the "martingale" — not the betting system, but a type of stochastic process. In a fair game, your expected fortune at any future time is exactly your current fortune. If you have $100 now, your expected fortune after any number of fair bets is still $100. This seems to suggest that the game is balanced, that neither player has an advantage. But expectation is a sneaky thing. It tells you the average outcome over many parallel universes, not what happens in the one universe you actually live in.

In most of those parallel universes, you go broke. In a few, you win big. The average comes out to zero, but the median outcome — what happens to the typical player — is ruin. This is the difference between mean and median, between the arithmetic average and the typical experience. The casino doesn't care about the universe where you win millions; they care about the 98 universes where you leave with nothing. Across all possible futures, they collect your money.

Try the simulation below. Watch a thousand parallel universes unfold simultaneously. See how many make it to the green line — casino ruin — versus how many crash into the red line at zero. The paths that succeed have to overcome not just the coin flips, but the geometry of their starting position. They have to travel farther, on average, to reach their goal. And the longer the walk, the more opportunities for a run of bad luck to end the game.

You vs. The Casino

[Interactive simulation: watch 1,000 simultaneous random walks, your bankroll (default $100) against the casino's (default $10,000) at a 49% win probability, and see how many survive.]
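What the simulation animates can be reproduced in a few lines of Monte Carlo. A sketch with scaled-down numbers so it runs quickly ($10 against a $90 casino rather than the defaults above; the function name is mine):

```python
import random

def ruin_fraction(k: int, N: int, p: float, trials: int, seed: int = 42) -> float:
    """Fraction of simulated walks absorbed at 0 (ruin) before reaching N."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        wealth = k
        while 0 < wealth < N:              # walk until a wall absorbs it
            wealth += 1 if rng.random() < p else -1
        if wealth == 0:
            ruined += 1
    return ruined / trials

# 1,000 parallel universes, fair coin.  Theory: ruin in 1 - 10/100 = 90%.
print(ruin_fraction(10, 100, 0.5, trials=1000))  # close to 0.9
```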
Chapter 4

The Whale and the Velvet Rope

Here's a paradox that confuses beginners: if gambler's ruin says the house always wins, why do casinos treat big gamblers like royalty? Why the complimentary suites, the private jets, the lines of credit in the millions? Why does Steve Wynn personally greet the whales who could, theoretically, threaten his empire?

The answer lies in a distinction that gamblers often miss: the casino doesn't play the same game you do. When you sit down at a blackjack table, you're betting against the house. But when a whale sits down, they're often betting with the house — or at least, the house is betting that variance will save them.

Consider the mathematics. A typical high roller might have a million-dollar line of credit. A major casino might have a hundred million in cash reserves. By our formula, the whale's probability of breaking the casino is 1%. That's not zero. In fact, it's high enough that casinos have indeed been broken by single players. Akio Kashiwagi, a Japanese real estate tycoon, once won $10 million from Donald Trump's Taj Mahal in a single baccarat session in 1990. The casino had to scramble for liquidity.6

But here's the key: the casino doesn't need to win every session. They need to win over the long run. And the long run, for a casino, is measured in thousands of whales over decades. Each whale is a separate trial, and the law of large numbers ensures that the 1% who break the house are more than offset by the 99% who don't. The casino's "house edge" isn't really in the game odds — it's in the portfolio effect of playing thousands of independent ruin games simultaneously.

This is why casinos love volatility. In a strange twist, a high-variance player — someone who makes enormous bets, who swings wildly between wins and losses — is actually better for the casino than a grinder who makes small, conservative bets. The grinder reduces variance, extending the game, letting the law of large numbers work its magic. The whale concentrates the outcome into fewer trials. Sometimes they win big, yes. But more often, they hit their ruin barrier before they hit the casino's.

The velvet rope isn't a reward for good customers. It's a selection mechanism. The casino is saying: come in, take your shot at our $100 million with your $1 million. The odds are 99 to 1 in our favor. We'll even comp your room.

Professional gamblers know that the only way to beat gambler's ruin is to have a positive expected value — to play a game where the odds are actually in your favor. Card counters at blackjack, advantage players at poker, sports bettors with superior models — these are the rare exceptions who flip the sign on the equation. But even they face ruin. A card counter with a 1% edge and a small bankroll still has a significant chance of going broke before their edge manifests. This is why professional advantage players obsess over bankroll management, why they size their bets as a fraction of their total capital, why they practice what mathematician Edward Thorp called "fortune's formula" — the Kelly Criterion.7 In a sense, they're not trying to win the game; they're trying to survive long enough for the law of large numbers to rescue them from the jaws of ruin.
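The Kelly Criterion that Thorp popularized sizes each bet so that the logarithm of wealth grows fastest: for a bet paying net odds b with win probability p, the optimal stake is the fraction p − (1 − p)/b of your bankroll. A minimal sketch (the function name is mine):

```python
def kelly_fraction(p: float, b: float = 1.0) -> float:
    """Kelly-optimal fraction of bankroll to stake: f* = p - (1 - p) / b,
    where p is the win probability and b the net odds received.
    Returns 0 when the edge is negative (don't bet)."""
    return max(0.0, p - (1 - p) / b)

# A card counter with a 1% edge on even-money bets (p = 0.505):
print(round(kelly_fraction(0.505), 4))  # 0.01 -> stake 1% of bankroll
```

Notice how small the stake is: even a genuine edge only justifies risking a sliver of capital per bet, which is exactly the survival discipline the paragraph above describes.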

Chapter 5

Beyond the Casino Floor

Here's the thing about gambler's ruin that makes it more than a casino curiosity: the random walk doesn't care whether the "bets" are roulette spins or quarterly earnings reports. The same mathematical structure shows up anywhere a finite resource faces repeated random shocks with an absorbing barrier at zero.

[Figure: four faces of gambler's ruin. STARTUPS: the runway problem; 18 months of cash is a small bankroll, and a 90% failure rate is a ruin probability of about 0.9. SPECIES EXTINCTION: small populations, big variance; 50 condors are 50 chips, and the barrier at 0 is extinction. LITIGATION: shallow pockets vs. corporate treasury; each motion is another bet, and settlement is ruin avoidance. WAR OF ATTRITION: reserves vs. shoestring; each battle is a coin flip, and guerrilla warfare is refusing the walk.]
Four faces of gambler's ruin: the same random walk, different domains.

Startups. A startup with 18 months of runway is a gambler with a small bankroll. Each month, the company either gets closer to profitability or burns more cash. The venture capital industry's obsession with "runway" is, mathematically, a ruin probability calculation. And the survival statistics are brutal: roughly 90% of startups fail, a number that rhymes suspiciously with the ruin probability of a small player against a vast and indifferent market.4

The startup world has developed its own folklore to explain this 90% failure rate. They blame bad product-market fit, incompetent founders, poor timing, competition. But the mathematics offers a simpler explanation: most startups are simply playing a game with too little capital relative to the variance of the outcome. Even good companies — companies that would eventually find product-market fit if they had infinite runway — hit the absorbing barrier of bankruptcy before the statistics can save them. The "lean startup" methodology, for all its virtues, can be read as a strategy for reducing burn rate, effectively increasing the number of "bets" a company can make before ruin. But it doesn't change the fundamental asymmetry: the market has more capital than you do.

Species extinction. A population of 50 endangered condors faces the same random walk. Each breeding season is a coin flip — births versus deaths — and the absorbing barrier is zero. Population ecologist Russell Lande showed in 1993 that small populations face an almost certain extinction probability over long enough time horizons, even when the average growth rate is positive. The variance kills them before the mean can save them.5

This has profound implications for conservation. It means that simply maintaining a population isn't enough — you need a buffer, a safety margin above the minimum viable population. The mathematics of gambler's ruin tells us that a species hovering at 50 individuals isn't "endangered" in the sense of "might die out." It's endangered in the sense of "will almost certainly die out, given enough time." The only question is whether that time is measured in decades or centuries. This is why conservation biologists focus so heavily on increasing population size — not just halting decline, but building the bankroll that gives the species a fighting chance against the random walk of birth and death.
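Lande's point can be illustrated with a toy branching process: give every individual a positive average growth rate and watch small populations die anyway. A sketch under assumed, purely illustrative birth rules (each individual leaves two offspring with probability 0.505, else none, so the mean growth rate is 1.01 per generation):

```python
import random

def extinction_fraction(n0: int, generations: int, trials: int, seed: int = 7) -> float:
    """Fraction of simulated populations absorbed at zero within the horizon,
    despite a positive mean growth rate of 1.01 per generation."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(trials):
        n = n0
        for _ in range(generations):
            # Each individual leaves 2 offspring with prob 0.505, else 0.
            n = sum(2 for _ in range(n) if rng.random() < 0.505)
            if n == 0:
                extinct += 1
                break
    return extinct / trials

# Same birth rules, different starting "bankrolls": 10 vs. 50 individuals.
small = extinction_fraction(10, 100, trials=200)
large = extinction_fraction(50, 100, trials=200)
print(small, large)  # the smaller population dies out far more often
```

The growth rate is identical in both runs; only the starting population changes. That gap is the conservation buffer the paragraph above argues for.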

Litigation. A plaintiff with shallow pockets suing a corporation with a legal budget that might as well be infinite. Every motion, every deposition, every continuance is another round of the game. The corporation doesn't need to win on the merits — it needs to keep playing until the plaintiff runs out of money. This is why most civil lawsuits settle: the smaller player recognizes the ruin problem. The settlement amount often has little to do with the merits of the case and everything to do with the relative bankrolls. It's not justice; it's arithmetic.

Wars of attrition. Two nations at war, one with vast reserves of manpower and materiel, the other operating on a shoestring. The larger nation doesn't need better tactics or superior strategy — it just needs to prolong the conflict. Each battle is a coin flip. The smaller nation might win some early victories, might even achieve favorable kill ratios. But the absorbing barrier is closer for them. This is why guerrilla warfare developed as a strategy: the only way for the smaller power to win is to change the game entirely, to refuse to play the attrition game, to make the cost of each "bet" so high that the larger power chooses to stop playing.

If you are the smaller player, the optimal strategy is not to play a long game. It's to play a short one. Make fewer, larger bets rather than many small ones. Shorten the random walk so variance works in your favor before the long run kills you. This is why basketball underdogs slow the game down to shrink the number of possessions, why guerrilla armies avoid pitched battles, and why poker players with short stacks go all-in. The math is the same everywhere: when you're closer to ruin than your opponent, embrace volatility. It's the only thing that can save you.
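The fewer-larger-bets advice can be checked directly with the ruin formula: measuring money in units of the bet size shortens the walk. A sketch comparing bet sizes for an underdog (the function name is mine):

```python
def win_probability(k: int, N: int, p: float, bet: int) -> float:
    """Chance of reaching total wealth N before 0 when staking `bet` per
    round; counting money in units of `bet` shortens the random walk."""
    units, target = k // bet, N // bet
    r = (1 - p) / p
    if r == 1.0:                       # fair game
        return units / target
    return (1 - r**units) / (1 - r**target)

# Underdog: $10 vs. a $90 opponent, winning each bet 45% of the time.
for bet in (1, 5, 10):
    print(bet, round(win_probability(10, 100, 0.45, bet), 8))
# Bolder bets give the underdog a dramatically better shot.
```

With $1 bets the underdog's chance is astronomically small; betting the whole $10 at once raises it to a few percent. Volatility is the short stack's only friend.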

Chapter 6

The Infinite Bankroll

There's a mathematical concept that makes gamblers uncomfortable: the idea of an infinite bankroll. In the limit, as the casino's resources approach infinity, your probability of ruin approaches 1. Certainty. Not "probably" or "likely" but will happen. Against an infinite bankroll, even a fair game becomes a death sentence.

This isn't just theoretical abstraction. In many real-world contexts, one player effectively does have infinite resources relative to the other. The U.S. government playing against a tax protester. Evolution playing against an individual species. Time itself playing against every finite entity that ever existed. These are games where one side can wait forever, can absorb any loss, can keep playing until the inevitable happens.

The mathematician Abraham de Moivre — who, ironically, died in poverty despite his contributions to probability — studied this problem in the 18th century. He wanted to know: how long does a game of gambler's ruin last, on average? His answer is elegant and terrible. In a fair game, the expected duration is k(N−k) — your bankroll times the casino's bankroll. If you have $100 and they have $9,900, the expected game length is 990,000 rounds. That's a lot of coin flips. But remember: this is the average duration. Many games end much sooner. And every single one ends with someone at zero.
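De Moivre's duration formula is easy to sanity-check by simulation, again with scaled-down numbers so it runs in seconds (the function name is mine):

```python
import random

def average_duration(k: int, N: int, trials: int, seed: int = 3) -> float:
    """Mean number of fair coin-flip bets until either wall absorbs the walk."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        wealth, steps = k, 0
        while 0 < wealth < N:
            wealth += 1 if rng.random() < 0.5 else -1
            steps += 1
        total += steps
    return total / trials

# de Moivre predicts k * (N - k) = 10 * 90 = 900 flips on average.
print(round(average_duration(10, 100, trials=2000)))  # close to 900
```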

The infinite bankroll is the mathematical expression of a harsh truth: in any long-running competition, the player with more staying power wins, all else being equal. This is why "persistence" is such powerful advice for entrepreneurs, why "survival" is the first principle of evolution, why Sun Tzu emphasized the importance of supply lines. The player who can keep playing wins by default when the other player hits their absorbing barrier.

But here's a twist: what if both players have infinite bankrolls? This seems like it would create an eternal stalemate, a game that never ends. And mathematically, that's exactly what happens. With infinite resources on both sides, the random walk wanders forever, never hitting either barrier. This is the mathematical justification for why monopolies and duopolies can persist indefinitely — when both sides have effectively infinite resources, competition becomes a forever-war with no resolution. It's also why regulatory intervention often focuses on breaking up concentrations of power: to restore the finite nature of the game, to reintroduce the possibility of ruin.

Chapter 7

The Geometry of Time

You might think Archie Karas lost forty million dollars because he was reckless. But actually, the recklessness is almost beside the point. The theorem was going to get him eventually. That's what gambler's ruin means: in a game with any house edge at all, played long enough, the probability of ruin for the smaller player approaches certainty.

And here's the truly uncomfortable implication: even in a fair game — no house edge, fifty-fifty odds — a player with finite resources facing an opponent with vastly more resources will still go broke with near certainty. The asymmetry isn't in the odds. It's in the bankrolls. The smaller stack gets absorbed by the larger one, the way a puddle evaporates and the ocean doesn't.

Let's think about what "long enough" means. In de Moivre's formula, time scales with the product of the bankrolls. If you're playing against a player with ten times your money, the game might last ten times longer than if you had equal bankrolls — but your chance of winning is still only 1/11. Time becomes a weapon for the larger player. They don't need to beat you quickly; they can afford to wait. The longer the game goes on, the more opportunities for variance to swing against you, the more likely you are to hit that left wall.

Karas knew the odds as well as anyone. What he didn't respect was time. He kept playing — through the $40 million peak, through the first big losses, through the point where the smart move was to walk away and never return. But walking away requires something that mathematics can describe but never provide: the willingness to stop.

There's a curious parallel here to the "gambler's fallacy" — the belief that a run of bad luck must be "due" for a correction. Karas didn't fall for that; he understood that each bet was independent. What he fell for was something more subtle: the belief that because he had beaten the odds before, he had some special power over them. This is the "hot hand" fallacy applied to one's own life. But the mathematics doesn't care about your narrative. It just keeps counting down the rounds until absorption.

The house doesn't need to cheat. It just needs to be bigger than you and more patient than you. Time does the rest.

Consider what happens as time approaches infinity. In the language of probability, the event "gambler eventually goes broke" has probability 1 — we say it happens "almost surely." This is a technical term with a precise meaning: there exist paths through the probability space where the gambler never goes broke, but their total measure is zero. In the limit, they're infinitely unlikely. The universe where Archie Karas retired with his forty million and lived happily ever after exists in the mathematical formalism, but it's a set of measure zero. The overwhelming majority of universes end the way ours did: with him back at zero.

This is the lesson that connects Karas in his booth at Binion's to the startup burning cash to the endangered species clinging to existence: in any repeated game against a larger opponent, survival is not the default. Survival is the thing that requires active, conscious, mathematically informed strategy. Everything else is a random walk toward zero.

The mathematics offers no comfort, but it does offer clarity. It tells us exactly what we're up against. It tells us that the velvet rope is a trap, that the comped suite is part of the game, that the casino doesn't fear the whale — they fear the player who knows when to stop. The house always wins not because they're evil or cheating, but because they understood the geometry of ruin long before you sat down at the table.

The only question left is: will you?

Notes & References

  1. Michael Konik, The Man With the $100,000 Breasts (Huntington Press, 2002).
  2. Christiaan Huygens, De Ratiociniis in Ludo Aleae (1657).
  3. Sheldon Ross, A First Course in Probability, 10th ed. (Pearson, 2019).
  4. Scott Kupor, Secrets of Sand Hill Road (Portfolio, 2019).
  5. Russell Lande, "Risks of Population Extinction from Demographic and Environmental Stochasticity and Random Catastrophes," The American Naturalist 142, no. 6 (1993): 911–927.
  6. Richard W. Munchkin, Gambling Wizards (Huntington Press, 2002). See also John L. Smith, Running Scared: The Life and Treacherous Times of Las Vegas Casino King Steve Wynn (Barricade Books, 2001).
  7. Edward O. Thorp, Beat the Dealer (Vintage, 1966). See also William Poundstone, Fortune's Formula (Hill and Wang, 2005).