
The Missing Chapter

The St. Petersburg Paradox

A game worth infinite money that nobody wants to play

An extension of Jordan Ellenberg's "How Not to Be Wrong"

Chapter 20

The Most Generous Game Ever Invented

In 1738, a young Swiss mathematician named Daniel Bernoulli presented a paper to the Imperial Academy of Sciences in St. Petersburg. The paper described a coin-flipping game so generous that its expected payout was literally infinite. And yet, as Bernoulli noted, no reasonable person would pay even 25 rubles to play it. The gap between mathematical expectation and human willingness to pay would haunt economics for the next three centuries.

Here's the game. I flip a fair coin. If it comes up heads, I pay you $2 and flip again. If it comes up heads again, I double your payout to $4 and flip again. I keep flipping and doubling until the coin finally lands tails—at which point you collect whatever has accumulated. So: one head then tails pays $2. Two heads then tails pays $4. Three heads pays $8. Ten heads in a row? That's $1,024. Twenty heads? Over a million dollars.

Now the question: how much would you pay for the privilege of playing this game once?

Take a moment. Think about it. Pick a number.

If you're like most people, you said something between $5 and $25. Maybe $10 feels about right. Maybe you'd stretch to $20 if you were feeling bold. Almost nobody says more than $50.¹

And that's the paradox. Because according to the mathematics of expected value—the cornerstone of probability theory, the tool we use to price everything from insurance policies to stock options—the fair price of this game is infinite.

Expected Value of the St. Petersburg Game
E = ¼ × $2 + ⅛ × $4 + ¹⁄₁₆ × $8 + ···
Each term equals 50 cents. There are infinitely many terms. The sum diverges to infinity.

Each possible outcome contributes exactly 50 cents to the expected value. The probability of getting exactly k heads before the first tails is (½)^(k+1), and the payout is $2^k. Multiply them: always 50 cents. Sum infinitely many half-dollars and you get infinity. Q.E.D., as mathematicians like to say right before the philosophers start yelling.
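The arithmetic is easy to check numerically. A minimal sketch in Python, using the game exactly as described above (the function name is my own):

```python
# k heads followed by tails pays $2**k and happens with probability
# (1/2)**(k + 1), so each term of the expectation is worth 50 cents.
def truncated_ev(max_heads: int) -> float:
    return sum(0.5 ** (k + 1) * 2 ** k for k in range(1, max_heads + 1))

print(truncated_ev(10))    # 5.0
print(truncated_ev(100))   # 50.0
```

Truncate wherever you like: every extra term adds the same half dollar, which is precisely why the untruncated sum has no finite value.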

• • •

Your Turn at the Table

Don't take my word for it. Play the game yourself—hundreds of times, thousands of times—and watch what happens to your average payout. According to theory, the more you play, the higher that average should climb, forever approaching infinity. In practice? Well, let's see.

St. Petersburg Game Simulator

Set a ticket price and play. Watch how your cumulative profit evolves. The average payout should go to infinity... eventually.

[Interactive simulator: live counters for games played, total spent, total won, average payout, net profit, and best single game, plus a ticket-price control and a chart tracking the average payout over time.]

Did your average payout shoot off to infinity? Almost certainly not. You probably saw it hover around $4–$8 for long stretches, then occasionally spike when a rare long streak of heads occurred, then settle back down. That spike—that's the paradox in action. The expected value is carried almost entirely by absurdly rare events. In theory, they'll show up. In practice, you'll die waiting.²
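If you prefer scripting to clicking, a small Monte Carlo sketch of the same experiment (rules as described above; all names my own) shows the pattern:

```python
import random

def play_once(rng: random.Random) -> int:
    """Flip until tails: k heads in a row pays $2**k, no heads pays $0."""
    heads = 0
    while rng.random() < 0.5:   # heads with probability 1/2
        heads += 1
    return 2 ** heads if heads else 0

def average_payout(n_games: int, seed: int = 42) -> float:
    rng = random.Random(seed)
    return sum(play_once(rng) for _ in range(n_games)) / n_games

# The sample mean creeps upward only logarithmically, punctuated by
# spikes from rare long streaks; it never looks remotely infinite.
for n in (100, 10_000, 1_000_000):
    print(f"{n:>9} games: average ${average_payout(n):.2f}")
```

However long you run it, the running average grows only on the order of ½ log₂(n), exactly as note 2 describes.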

• • •

Bernoulli's Brilliant Fix

Daniel Bernoulli's insight was beautifully simple: the mathematical value of money is not the same as the human value of money. Winning $2 when you have nothing is a very different experience from winning $2 when you already have a million dollars. The tenth dollar matters more than the ten-thousandth.

Bernoulli proposed that people don't maximize expected money—they maximize expected utility, where utility is a concave function of wealth. His specific proposal: utility is proportional to the logarithm of wealth.³

[Figure: value as a function of dollars under a linear curve (expected value) and a logarithmic curve (utility); the gap between the two curves is the St. Petersburg paradox.]

Linear expected value says every dollar matters equally. Logarithmic utility says each additional dollar matters a little less. The gap between these two views is where the paradox lives.

Under log utility, the expected utility of the St. Petersburg game converges to a finite number. The "fair price" drops from infinity down to something quite modest—around $4 to $6 for someone with typical wealth. Bernoulli had essentially invented diminishing marginal utility a century before the neoclassical economists would rediscover it and claim credit.⁴
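In code, Bernoulli's fair price is the ticket cost at which playing and not playing leave you with equal expected log utility. A sketch under the rules above (the series is truncated, the no-heads branch pays $0, and all names are my own):

```python
import math

def fair_price_log(wealth: float, n_terms: int = 50) -> float:
    """Bisect for the price c where E[log(wealth - c + payout)] = log(wealth)."""
    def gain(c: float) -> float:
        eu = 0.5 * math.log(wealth - c)   # tails straight away: win nothing
        for k in range(1, n_terms + 1):   # k heads then tails: win $2**k
            eu += 0.5 ** (k + 1) * math.log(wealth - c + 2 ** k)
        # crude tail: lump the leftover probability onto the last payout
        eu += 0.5 ** (n_terms + 1) * math.log(wealth - c + 2 ** n_terms)
        return eu - math.log(wealth)
    lo, hi = 0.0, wealth - 1.0            # keep log's argument positive
    for _ in range(80):                   # gain() falls as the price rises
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if gain(mid) > 0 else (lo, mid)
    return lo

print(round(fair_price_log(1_000), 2))   # a single-digit dollar amount
```

With $1,000 in the bank this lands in roughly the $5–$6 range, and it rises only slowly as wealth grows.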

A dollar is not a dollar is not a dollar. The first $100 you earn in a month keeps you alive. The last $100 buys slightly nicer wine. Expected value treats them identically. Expected utility does not. That's not a bug in human reasoning—it's a feature.

This was a genuinely radical move. Bernoulli was saying that pure mathematics—the expected value calculation—was giving the wrong answer to a practical question. Not because the math was wrong, but because the math was answering the wrong question. The right question isn't "what is this game worth in dollars?" but "what is this game worth to you?"

• • •

Explore the Utility Landscape

Bernoulli used the logarithm, but he could have chosen other concave functions. Each one captures a different flavor of risk aversion. A square-root utility function, for instance, implies someone who's risk-averse but less dramatically so than the logarithm suggests. Play with the explorer below to see how your choice of utility function changes the "fair price" of the St. Petersburg game.

Utility Function Explorer

See how different utility functions change the fair price of the St. Petersburg game. Adjust your current wealth and risk aversion to see the effect.

[Interactive explorer: sliders for your current wealth (e.g. $10,000) and the maximum number of flips to consider (e.g. 25), with fair-price readouts for linear (EV), logarithmic, and square-root utility.]

Notice something? As your wealth increases, the fair prices for log and square-root utility go up—because losing $20 matters less when you have $100,000 than when you have $500. But they never go to infinity. Diminishing marginal utility always tames the infinite sum.⁵

• • •

But Wait—Bernoulli Didn't Actually Fix It

Here's the dirty secret that textbooks often skip: Bernoulli's solution doesn't really solve the problem. It just pushes it one level deeper.

In 1934, the mathematician Karl Menger pointed out that you can construct a "Super St. Petersburg" game where the payouts grow so fast that even the logarithmic expected utility diverges to infinity.⁶ Instead of paying $2^k for k heads, pay $e^(2^k). Taking the logarithm just hands back the exponent 2^k, which grows exactly fast enough to cancel the shrinking probabilities, so the expected-utility series diverges all over again. For any unbounded utility function, there exists some version of the paradox that breaks it.
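Menger's construction is easy to verify without ever forming the astronomical payouts, since log(e^(2^k)) is just 2^k (a sketch, with my own names):

```python
# Super St. Petersburg: k heads then tails pays $e**(2**k).
# Log utility of that payout is exactly 2**k, so each term of the
# expected-utility series is (1/2)**(k + 1) * 2**k = 0.5 -- divergent again.
def super_log_utility_sum(n_terms: int) -> float:
    return sum(0.5 ** (k + 1) * 2 ** k for k in range(1, n_terms + 1))

print(super_log_utility_sum(40))   # 20.0, and climbing without bound
```

Notice the series is term-for-term identical to the original game's expected-money series: Bernoulli's fix didn't remove the divergence, it relocated it one level up.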

So the real resolution requires one of two moves:

Two Escape Routes

1. Bound your utility function. Declare that there's some maximum happiness a person can experience, no matter how much money they receive. Above some wealth level, you simply can't get any happier. This kills all Super St. Petersburg games dead.

2. Bound the game itself. No casino has infinite money. No game runs forever. If the casino's bankroll is, say, $1 trillion, then the payouts cap out after about 40 heads, and the expected value collapses to a thoroughly finite, thoroughly modest $20 or so. Suddenly the whole paradox evaporates.
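Route 2 is easy to check numerically for the game as described at the top of the chapter, capping every payout at the bankroll (names my own):

```python
def bounded_ev(bankroll: float) -> float:
    """Expected value when no payout can exceed the casino's bankroll."""
    total, k = 0.0, 1
    while 2 ** k <= bankroll:                 # payouts the casino can cover
        total += 0.5 ** (k + 1) * 2 ** k      # the usual 50-cent term
        k += 1
    total += 0.5 ** k * bankroll              # all longer streaks pay the cap
    return total

print(round(bounded_ev(1e12), 2))   # about $20
```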

Most economists today take some combination of both approaches. Utility functions are assumed to be bounded, or at least to grow slowly enough that the shrinking probabilities win out over the growing payouts. And real-world constraints—finite lifetimes, finite wealth, the heat death of the universe—impose natural limits on any actual game.

But the conceptual problem remains: expected value can give you answers that are, in a deep sense, useless. And that's worth remembering every time someone quotes you an expected value as though it settles an argument.

• • •

Where You've Already Met This Paradox

You might think the St. Petersburg paradox is a cute philosophical toy with no real-world consequences. You would be wrong. It's hiding inside some of the most important financial decisions of our time.

Startup Valuations

A venture capitalist evaluating a pre-revenue startup is playing a version of the St. Petersburg game. The expected value calculation says: there's a 0.001% chance this company becomes the next Google (worth $1 trillion), so the expected value of that outcome alone is $1 trillion × 0.001% = $10 million. Run through enough possible outcomes and the expected value of any startup can be made to look enormous.

But VCs don't invest based on expected value alone. They invest based on something much closer to expected utility—shaped by portfolio constraints, fund lifetime, the difference between a 3× return and a 100× return for their career. The St. Petersburg paradox is the reason why "this startup has infinite upside" is not, by itself, a compelling investment thesis.⁷

[Figure: where the expected value lives. In 99.9% of games you win $2–$16; in the other 0.1% you win $1,000 or more. Every outcome contributes the same amount to the expected value, but there are infinitely many of the rare ones. The infinite EV comes from events you'll almost never see.]

Every possible outcome contributes exactly the same amount to the expected value. The paradox is that the infinitely many tiny-probability outcomes each contribute just as much as the common ones.

Insurance Pricing

Insurance is the St. Petersburg paradox in reverse. You're paying a certain amount (the premium) to avoid an uncertain catastrophe. Expected value says you should never buy insurance—the premium is always higher than the expected loss (otherwise the insurance company would go broke). But expected utility says you should, because the utility cost of financial ruin vastly exceeds the utility cost of a modest premium.
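Bernoulli's reversal is easy to verify with a toy calculation; every number below is illustrative, not drawn from any real policy:

```python
import math

# Toy numbers (purely illustrative): a $100,000 net worth, a 1% chance
# of a $90,000 loss, and a $1,200 premium -- more than the $900
# expected loss, so expected value says never buy the policy.
def expected_log_wealth(wealth, loss, p_loss, premium, insured):
    if insured:
        return math.log(wealth - premium)        # certain, modest cost
    return (p_loss * math.log(wealth - loss)     # rare, near-ruinous outcome
            + (1 - p_loss) * math.log(wealth))

w, loss, p, prem = 100_000.0, 90_000.0, 0.01, 1_200.0
insure = expected_log_wealth(w, loss, p, prem, insured=True)
bare = expected_log_wealth(w, loss, p, prem, insured=False)
print(insure > bare)   # True: log utility prefers paying the unfair premium
```

The premium exceeds the expected loss, yet the concave utility of wealth makes the certain small cost preferable to the rare ruinous one.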

Bernoulli would be pleased. He essentially invented the mathematical justification for the entire insurance industry 250 years before it became a multitrillion-dollar business.

Lottery Tickets

And then there's the lottery, which is the St. Petersburg game's evil twin. The expected value of a lottery ticket is negative—you lose about 50 cents on every dollar. Yet people buy billions of dollars' worth. Why? Partly because the utility of dreaming about winning is itself worth something. Partly because the utility function curves the other way for small amounts: losing $2 is nothing, but gaining $100 million is life-changing. For small stakes and huge payoffs, people become risk-seeking, not risk-averse.⁸

Daniel Kahneman and Amos Tversky formalized this observation in their prospect theory—which, in a beautiful historical rhyme, essentially said the same thing Bernoulli said in 1738, just with better data and a more complicated function.

• • •

What the Paradox Really Teaches

The St. Petersburg paradox isn't really about a coin-flipping game. It's about the limits of any single number to capture what we care about.

Expected value is a magnificent tool. It's the foundation of probability, statistics, insurance, finance, decision theory. But it has a hidden assumption: that you should be neutral about risk. That a 50% chance of $200 is exactly as good as a guaranteed $100. For small stakes, this is roughly true. For large stakes—stakes that could change your life, ruin you, or make you rich beyond imagining—it is absurd.

"The value of an item must not be based on its price, but rather on the utility it yields." — Daniel Bernoulli, 1738

Every time you see an expected value calculation in the wild—"this policy will save an expected 10,000 lives," "the expected return on this investment is 12%"—ask yourself: what does the distribution look like? Is the expected value dominated by a few extreme scenarios that will probably never happen? Are the stakes large enough that diminishing marginal utility matters?

The St. Petersburg paradox is, at bottom, a reminder that mathematics is a language for describing reality, not a replacement for thinking about it. The expected value formula is speaking clearly. The question is whether it's saying anything useful.

Bernoulli figured this out in 1738, in a cold room in St. Petersburg, watching coins spin in the air. Three centuries later, we're still learning the lesson.

Notes & References

  1. Hacking, Ian. "The Emergence of Probability." Cambridge University Press, 1975. Surveys of willingness to pay for the St. Petersburg game consistently show median values of $5–$20.
  2. The convergence of the sample mean in the St. Petersburg game is extraordinarily slow. Feller showed that after n games, the average payout is approximately ½ log₂(n), not infinity. See Feller, W. "An Introduction to Probability Theory and Its Applications," Vol. 1, Wiley, 1968.
  3. Bernoulli, Daniel. "Specimen Theoriae Novae de Mensura Sortis" (Exposition of a New Theory on the Measurement of Risk), 1738. Translated in Econometrica, Vol. 22, No. 1 (1954), pp. 23–36.
  4. The marginalist revolution of the 1870s—Jevons, Menger, Walras—rediscovered diminishing marginal utility independently. Bernoulli's priority is acknowledged but was largely forgotten until the 20th century.
  5. For log utility with initial wealth W, the fair price of the St. Petersburg game (truncated at n flips) satisfies: the expected log utility of paying c and receiving the payout equals the log utility of keeping your money. This yields a finite c for any finite or infinite n.
  6. Menger, Karl. "Das Unsicherheitsmoment in der Wertlehre." Zeitschrift für Nationalökonomie, 5 (1934), pp. 459–485. Menger showed that any unbounded utility function admits a St. Petersburg-style paradox.
  7. Kerr, William R. and Nanda, Ramana. "Financing Innovation." Annual Review of Financial Economics, Vol. 7 (2015), pp. 445–462. Discusses how VCs manage fat-tailed return distributions.
  8. Kahneman, Daniel and Tversky, Amos. "Prospect Theory: An Analysis of Decision under Risk." Econometrica, 47(2), 1979, pp. 263–291. The paper that launched behavioral economics shows that people overweight small probabilities and are risk-seeking for losses.