
The Missing Chapter

When Being Right Isn't Enough

A great bet at the wrong size is worse than a mediocre bet at the right size.

An extension of Jordan Ellenberg's "How Not to Be Wrong"

Chapter 1

The Gambler Who Knew the Odds

In the winter of 1961, a skinny math professor named Edward Thorp walked into a casino in Reno, Nevada, with ten thousand dollars that wasn't his. The money belonged to two wealthy gamblers — Emmanuel Kimmel and Eddie Hand — who'd bankrolled Thorp after reading about his blackjack research in the Washington Post. Thorp had done something no one had done before: he'd proven, mathematically, that a card counter could gain a genuine edge over the house. The casinos weren't random. They were beatable.1

But here's the thing almost nobody talks about: Thorp already knew how to win. The harder problem — the problem that would consume the next fifty years of his career and make him hundreds of millions of dollars — was figuring out how much to bet.

Where probability meets fortune — the mathematics hiding inside every spin.
A lone mathematician doing mental arithmetic while the dealer turns cards.

Picture this. You've cracked the code. You know the deck is rich in tens and aces, and the math says you have a 2% edge on the next hand. So you shove all your chips in, right? You have an edge. You're right.

And this is where most smart people go broke.

Thorp didn't figure this out alone. A year earlier, at MIT, he'd knocked on the office door of Claude Shannon — the father of information theory, arguably the most brilliant scientist in America — to discuss his blackjack research. Shannon was intrigued. The two men spent hours at Shannon's home in Winchester, Massachusetts, a sprawling Victorian crammed with unicycles, flame-throwing trumpets, and homemade robots. Amid this chaos, Shannon pointed Thorp to a 1956 paper by a Bell Labs physicist named John Larry Kelly Jr.2

Kelly's paper had nothing to do with gambling. It was about a technical problem in information theory: how to maximize the growth rate of wealth when you receive noisy signals over a telephone line.3 But its implications were explosive. Kelly had derived a formula — elegant, simple, and deeply counterintuitive — that told you exactly how much to bet when you have an edge.

And the formula's most important lesson wasn't about what to do. It was about what not to do.

· · ·
Chapter 2

The Gambler Who Bets Too Much

Let's start with a thought experiment. I'm going to offer you a bet that's absurdly in your favor. A coin flip, but rigged: 60% chance you win, 40% chance you lose. When you win, you double your bet. When you lose, you lose your bet. You start with $1,000, and you're going to flip this coin 100 times.

What fraction of your bankroll should you bet each time?

Your expected value is positive on every single flip. The arithmetic says bet big. Bet everything. After all, each bet has a positive expected return of 20 cents on the dollar. The more you bet, the more you expect to make.

Try it and see what happens.

Kelly Criterion Betting Simulator

You start with $1,000. The coin lands heads 60% of the time at 1:1 odds. Choose what fraction of your bankroll to bet on each of 100 flips.


If you just ran that simulation — and you should, because the feeling matters more than the explanation — you noticed something strange. Betting 100% of your bankroll every time, which maximizes your expected value on each individual flip, almost certainly leaves you broke. Not unlucky-broke: one loss wipes you out, and the chance of winning all 100 flips is 0.6^100, roughly one in 10^22.

Okay, 100% is obviously dumb. But what about 80%? Or 50%? This is where it gets weird. Even betting 50% of your bankroll — with a 60% chance of winning each flip — will probably ruin you over 100 rounds. You'll have spectacular runs where you're up 10x, followed by a string of losses that erases almost everything.
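If you'd rather see it in code than in a widget, here's a standalone Monte Carlo sketch of the thought experiment (the function name `simulate` and its defaults are mine, chosen for illustration):

```python
import random
import statistics

def simulate(fraction, flips=100, start=1000.0, p=0.6, trials=2000, seed=1):
    """Median final bankroll after betting a fixed fraction of wealth each flip."""
    rng = random.Random(seed)
    finals = []
    for _ in range(trials):
        wealth = start
        for _ in range(flips):
            stake = wealth * fraction
            # Win with probability p (gain the stake), else lose the stake.
            wealth += stake if rng.random() < p else -stake
        finals.append(wealth)
    return statistics.median(finals)

for f in (0.10, 0.20, 0.50, 0.80):
    print(f"bet {f:.0%} each flip -> median final bankroll ${simulate(f):,.2f}")
```

Run it and the median — the typical outcome — grows at 10% and 20% bet sizes, but collapses toward zero at 50% and 80%, despite every single bet having positive expected value.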

The problem is that your bankroll doesn't grow by addition. It grows by multiplication. Each bet multiplies your wealth by some factor. And when you're multiplying, a different kind of average matters.

Chapter 3

Two Averages Walk Into a Bar

Here's the mathematical heart of it.

Say you bet 50% of your $1,000. You win (60% chance), and now you have $1,500. You lose (40% chance), and you have $500. Your expected wealth after one round is:

0.6 × $1,500 + 0.4 × $500 = $1,100

A 10% expected gain. Beautiful. But now play two rounds. If you win then lose (or lose then win), you have:

$1,000 × 1.5 × 0.5 = $750

You won exactly half your bets with a positive-expected-value wager, and you lost money. The arithmetic mean — the expected value — says you're making 10% per round. But the geometric mean — the thing that actually determines your long-run compound growth — says something different.
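The two averages can be computed directly. A minimal sketch for the 60/40 coin at a 50% bet fraction:

```python
p, f = 0.6, 0.5            # win probability, fraction of bankroll bet
win, lose = 1 + f, 1 - f   # wealth multipliers on a win / on a loss

arith = p * win + (1 - p) * lose   # arithmetic mean multiplier: 1.10 (+10%)
geo = win ** p * lose ** (1 - p)   # geometric mean multiplier: about 0.967

print(f"arithmetic mean per round: {arith:.3f}")
print(f"geometric mean per round:  {geo:.3f}")
print(f"win then lose on $1,000:   ${1000 * win * lose:,.0f}")
```

The arithmetic mean says +10% per round; the geometric mean, about 0.967, says the typical path shrinks by roughly 3.4% per round.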

Geometric Growth Rate
g(f) = p · log(1 + f) + (1 − p) · log(1 − f)
This is what Kelly said to maximize. Not expected value. The expected logarithm of wealth.5
Arithmetic return vs. geometric growth across bet fractions: the Kelly optimum sits at f* = 20%; overbetting brings more risk and less growth, and far enough past Kelly lies ruin.
The arithmetic line keeps going up. The geometric curve turns down. The gap between them is where fortunes disappear.

For our 60/40 coin at even odds, you can take the derivative, set it to zero, and get the Kelly fraction:

The Kelly Fraction
f* = 2p − 1 = 2(0.6) − 1 = 0.20
Bet 20% of your bankroll. Not more. Not less (if you want maximum growth).
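You don't have to trust the calculus. A brute-force scan of the growth-rate formula above (natural log throughout) finds the same optimum:

```python
import math

p = 0.6  # win probability at even odds

def g(f):
    """Expected log-growth per bet when wagering fraction f."""
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

# Scan bet fractions from 0% to 99.9% and keep the best.
best = max((i / 1000 for i in range(1000)), key=g)

print(f"numerical optimum f* = {best:.3f}")      # 0.200
print(f"closed form 2p - 1  = {2 * p - 1:.3f}")  # 0.200
```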
The Overbetting Penalty

The zone between Kelly and all-in is worse than the zone between zero and Kelly. Overbetting doesn't just add risk — it reduces your growth rate. Bet 40% instead of 20%, and you're taking more risk for less reward. You're paying a penalty for your aggression.

Two bettors, same edge, different fates — the fraction you bet matters more than the edge you have.
Chapter 4

The Kelly Formula

Let's write it more generally. You have a bet where you win with probability p, and when you win, you get b dollars for every dollar wagered (and lose your dollar when you lose). The Kelly fraction is:

General Kelly Criterion
f* = p − (1 − p) / b
If negative, don't bet — you have no edge.4
f*: optimal fraction of wealth to bet
p: probability of winning
b: net odds, the profit per dollar wagered on a win
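As a function, the general rule is a one-liner. The name `kelly` is mine, not from any library:

```python
def kelly(p, b):
    """Kelly fraction for win probability p at net odds b (profit per $1 wagered)."""
    f = p - (1 - p) / b
    return max(f, 0.0)  # a negative fraction means no edge: don't bet

print(f"{kelly(0.60, 1.0):.3f}")  # 0.200, the 60/40 even-odds coin
print(f"{kelly(0.51, 1.0):.3f}")  # 0.020, a thin blackjack-style edge
print(f"{kelly(0.40, 1.0):.3f}")  # 0.000, no edge: sit out
```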

Kelly Criterion Calculator

Input your edge and odds to find the optimal bet size. With a win probability of 60% and a win payout of $1.00 per $1 wagered:

Kelly optimal bet: 20.0%
Growth at Kelly: +2.0%
Growth at ½ Kelly: +1.5%
Growth at 2× Kelly: ≈ 0%
Expected return per bet: +20%

Play with that calculator. Notice something? Even Thorp's blackjack edge — maybe a 2% advantage in the best situations — calls for betting only about 2–4% of your bankroll on any given hand. Not 10%. Not 25%. The formula is conservative in a way that feels almost timid to people who know they have an edge.

That's the point.

Chapter 5

Why Geometric Means Rule the World

Kelly's formula works because of a mathematical fact about repeated multiplication that most people — including most very smart people — don't intuitively grasp.

When you play a game repeatedly, reinvesting your winnings each time, your long-run wealth isn't determined by the average outcome. It's determined by the typical outcome. And in a multiplicative world, those are very different things.

Two investments. Investment A returns +80% or −50% with equal probability. Expected return: +15% per period. Investment B returns +8% every time.

After two rounds of A, if you win then lose: $1,000 × 1.8 × 0.5 = $900. The most likely outcome — win once, lose once — leaves you down 10%.

Investment B after two rounds: $1,000 × 1.08² = $1,166.40. Boring. Reliable. Better.
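The two-investment comparison is small enough to enumerate exactly:

```python
import statistics

start = 1000.0
mults = [1.8, 0.5]  # Investment A: +80% or -50%, equally likely

# All four equally likely two-round paths for Investment A.
paths = [start * m1 * m2 for m1 in mults for m2 in mults]

mean_a = statistics.mean(paths)      # ensemble average: $1,322.50
median_a = statistics.median(paths)  # typical outcome:  $900.00

b_final = start * 1.08 ** 2          # Investment B: $1,166.40

print(f"A, mean of all paths: ${mean_a:,.2f}")
print(f"A, median path:       ${median_a:,.2f}")
print(f"B, steady +8%:        ${b_final:,.2f}")
```

The average of A's paths beats B — that average is carried entirely by the one-in-four win-win path.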

The Average Gets Richer. You Get Poorer.

50 simulated wealth trajectories of Investment A (+80% or −50% each period) over 200 periods, plotted with the ensemble mean and the median (typical) path. Most paths collapse despite positive expected value.
This gap between the average and the typical — between the arithmetic mean and the geometric mean — is one of the most important ideas in all of applied mathematics. It shows up in evolutionary biology, in population genetics, and in every investment decision you've ever made.6

Chapter 6

Fractional Kelly: The Real World

The road to ruin is paved with overbets — each gravestone marks a fraction that seemed like a good idea.

Now here's where I have to be honest with you about Kelly's limitations, because the formula in its pure form assumes you know things you almost never know.

It assumes you know your exact edge. In blackjack, Thorp could estimate it — the math is clean, the deck is finite. But in the stock market? In a startup investment? In your career bet on a new industry? You think you have an edge. You might be right. But you're not sure, and if you overestimate your edge, the Kelly formula tells you to bet too much.

The Asymmetry of Error

Underbetting by half (half-Kelly) costs you only about 25% of your maximum growth rate. But overbetting by double (double-Kelly) drives your growth rate to roughly zero — you're running in place while taking enormous risk. This asymmetry is why serious practitioners use fractional Kelly.
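The asymmetry is easy to check for the 60/40 coin, using the growth-rate formula from Chapter 3 (these are the exact figures behind the rounded rule of thumb):

```python
import math

p, f_star = 0.6, 0.2  # the 60/40 even-odds coin; Kelly fraction 2p - 1

def g(f):
    """Expected log-growth per bet at fraction f."""
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

print(f"growth at Kelly:        {g(f_star):+.4f}")      # +0.0201
print(f"growth at half Kelly:   {g(f_star / 2):+.4f}")  # +0.0150, ~75% of max
print(f"growth at double Kelly: {g(2 * f_star):+.4f}")  # -0.0024, near zero
```

Half-Kelly keeps about three quarters of the growth; double-Kelly keeps essentially none (here it's actually slightly negative) while roughly doubling the swings.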

"Half Kelly is a good general rule because it provides 75% of the growth with much less pain."
— Edward Thorp

This maps to a broader life principle. When you have an edge — in a career move, an investment, a business decision — the biggest danger isn't betting too small. It's betting too big. The cost of underbetting is linear: you grow a bit slower. The cost of overbetting is catastrophic: you can lose everything.

Chapter 7

The Bet You're Always Making

Here's what makes the Kelly criterion more than a gambling trick or an investing technique. It's a model for every decision where you reinvest the consequences.

You're choosing between two job offers. One is safe: steady salary, clear trajectory, modest upside. The other is a startup: it could make you rich, it could waste three years of your life. Kelly doesn't say take the safe job. And it doesn't say take the risky one. It says: how much of your life are you betting?

From age 25 to 65, the advice slides from "big bet OK" (larger f*) through "careful" (smaller f*) to "protect": your Kelly fraction should shrink as the consequences of ruin grow.
Life as a sequence of bets — your optimal fraction depends on what you can afford to lose.

If you're 25 with no dependents and a safety net, you can afford a larger Kelly fraction. The startup is a reasonable bet. If you're 45 with a mortgage and two kids in private school, the same bet might be catastrophic overbetting — not because the expected value is different, but because the consequences of ruin are different.

Edward Thorp understood this at a blackjack table in 1961. He spent the next six decades proving it, first in casinos, then on Wall Street — where his hedge fund, Princeton Newport Partners, returned nearly 20% annualized over two decades8 — then as one of the most successful investors in American history.7 And the deepest lesson he took from Kelly's formula wasn't about maximizing returns. It was about survival.

There are many degrees of success, but only one degree of ruin. You can only go broke once.

The Kelly criterion respects this asymmetry. It's the mathematics of not going broke — which turns out to be the most important mathematics of all.

Notes & References

  1. Edward O. Thorp, Beat the Dealer: A Winning Strategy for the Game of Twenty-One (New York: Random House, 1962). The book became a bestseller and was so effective that casinos changed their rules in response.
  2. Thorp describes this meeting in his memoir: Edward O. Thorp, A Man for All Markets (New York: Random House, 2017), pp. 43–52. Shannon and Thorp also built the first wearable computer together.
  3. John L. Kelly Jr., "A New Interpretation of Information Rate," Bell System Technical Journal 35, no. 4 (July 1956): 917–926.
  4. The Kelly fraction can also be written as f* = edge / odds. Derived by maximizing E[log(1 + f·X)].
  5. By the law of large numbers, (1/n)·Σ log(Xᵢ) → E[log(X)] almost surely. Since log is concave, Jensen's inequality gives E[log(X)] ≤ log(E[X]).
  6. See Ole Peters, "The Ergodicity Problem in Economics," Nature Physics 15 (December 2019): 1216–1221.
  7. Edward O. Thorp, "The Kelly Criterion in Blackjack, Sports Betting, and the Stock Market," in The Kelly Capital Growth Investment Criterion (2011), pp. 789–832.
  8. Princeton Newport Partners returned 19.8% annualized from November 1969 through December 1988.