The Physicist Who Told Economists They'd Been Wrong for 300 Years
In 2019, a physicist named Ole Peters published a paper in Nature Physics with one of the most audacious claims in the history of economics: that the entire field had been making a mathematical error since Daniel Bernoulli's 1738 paper on the St. Petersburg paradox.1 The error wasn't subtle. It wasn't a rounding issue or a contested assumption. It was a confusion between two fundamentally different kinds of average — the same confusion we explored in Chapter 2 on ergodicity, but now with consequences measured in trillions of dollars.
If you've read Chapter 2, you know the basic setup: a process is ergodic if the time average (what one person experiences over time) equals the ensemble average (what many people experience at one moment). Flipping a fair coin for additive bets is ergodic. But the moment you switch to multiplicative dynamics — where gains and losses are percentages of your current wealth — ergodicity breaks, and the two averages diverge in ways that can ruin you.
Peters's argument is that mainstream economics has been computing the wrong average for nearly three centuries. And the implications, if he's right, ripple through everything: why people buy insurance, why inequality grows, why "rational" risk aversion isn't irrational at all, and why GDP growth can coexist with most people getting poorer.
Let's start with a coin flip.
The Gamble That Eats You Alive
Here's a bet. I flip a fair coin. Heads: your wealth increases by 50%. Tails: your wealth decreases by 40%. Do you take it?
An economist trained in expected utility theory would compute the expected value of the one-round wealth multiplier:

0.5 × 1.5 + 0.5 × 0.6 = 1.05

Five percent expected gain per round. A no-brainer, says expected value theory. You'd be a fool not to play.
But now actually play it. Start with $100. Suppose you get one head and one tail (in either order):
$100 → heads → $150 → tails → $90
Or equivalently:
$100 → tails → $60 → heads → $90
Either way, you lost $10. A "positive expected value" gamble left you poorer. That's not bad luck — that's mathematics. After one head and one tail, you always multiply by 1.5 × 0.6 = 0.9. You lose 10% per round-trip, guaranteed.
The time-average growth rate tells the real story. Over many rounds you get roughly half heads and half tails, so your wealth compounds by the geometric mean of the multipliers:

√(1.5 × 0.6) = √0.9 ≈ 0.949

The ensemble average grows at +5% per round. But your wealth (the wealth of any single individual playing repeatedly) shrinks at about −5.1% per round. Play long enough and you go broke with probability one, even though the "expected value" is positive.2
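You can verify both numbers in a few lines. Here is a minimal sketch in plain Python, using only the standard library, of the two averages for the +50% / −40% gamble:

```python
import math

p_heads, gain, loss = 0.5, 0.50, 0.40

# Ensemble average: the expected one-round wealth multiplier
# across many simultaneous players.
ensemble = p_heads * (1 + gain) + (1 - p_heads) * (1 - loss)   # 1.05

# Time average: the geometric mean of the multipliers, i.e. the
# factor one player compounds by per round over a long run.
time_avg = math.exp(p_heads * math.log(1 + gain)
                    + (1 - p_heads) * math.log(1 - loss))      # ~0.949

print(f"ensemble average: {ensemble - 1:+.1%} per round")      # +5.0%
print(f"time average:     {time_avg - 1:+.1%} per round")      # -5.1%
```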
How can both be true? Because the ensemble average is propped up by a tiny number of extraordinarily lucky individuals. Imagine a thousand people playing this game. After many rounds, the average wealth across all players is enormous — but that average is dominated by one or two people who happened to flip an improbable streak of heads. The median player is destitute. The mean is a liar.
The Simulator: Watch Inequality Emerge from a Coin
Don't take my word for it. Watch it happen. Below, 100 agents all start with $1,000 and play the same coin-flip game. The ensemble average and the median tell completely different stories. And the Gini coefficient — the standard measure of inequality — climbs relentlessly.
[Interactive: Multiplicative Wealth Simulator. 100 agents play the +50% / −40% coin flip; watch the ensemble average and median diverge.]
After a hundred rounds, most players are effectively broke. A handful are fantastically wealthy. The average wealth is high — and totally meaningless as a description of anyone's actual experience. Sound familiar? This is the economy in miniature.
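If you'd rather run the experiment yourself than trust an interactive widget, here is a sketch that mirrors the simulator's setup (100 agents, 100 rounds, same gamble) and reports the mean, median, and Gini coefficient at the end:

```python
import random
import statistics

def gini(values):
    """Gini coefficient via the standard sorted-values formula."""
    xs = sorted(values)
    n = len(xs)
    weighted_sum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted_sum) / (n * sum(xs)) - (n + 1) / n

random.seed(42)
agents = [1000.0] * 100          # everyone starts with $1,000

for _ in range(100):             # each round: heads x1.5, tails x0.6
    agents = [w * (1.5 if random.random() < 0.5 else 0.6) for w in agents]

print(f"mean:   ${statistics.mean(agents):,.2f}")    # pulled up by a lucky few
print(f"median: ${statistics.median(agents):,.2f}")  # close to broke
print(f"gini:   {gini(agents):.2f}")                 # near 1: extreme inequality
```

Run it with different seeds: the specific winners change, but the divergence between mean and median never does.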
What Economists Got Wrong (According to Peters)
The standard story in economics goes like this. Daniel Bernoulli noticed in 1738 that people don't maximize expected wealth — they seem to maximize expected utility of wealth, where utility is some concave function like the logarithm.3 This was formalized by von Neumann and Morgenstern in 1944 into expected utility theory, which became the bedrock of modern economics.4
Peters's radical claim is that Bernoulli solved the right problem with the wrong method. People do act as if they're maximizing the logarithm of wealth — but not because they have logarithmic utility. They do it because they're maximizing their time-average growth rate, which, for multiplicative dynamics, is given by the expected value of the logarithm of returns.
This is a subtle but profound distinction. Expected utility theory says: "People are risk-averse because their brains curve the value of money." Peters says: "People are risk-averse because the mathematics of repeated multiplicative gambles requires it for survival." One is psychology. The other is physics.
In an ergodic process, it doesn't matter whether you compute the time average or the ensemble average — they're the same. Expected value maximization works fine.
In a non-ergodic process (like multiplicative wealth dynamics), they diverge. Expected value maximization can lead you to certain ruin. You must instead maximize the time-average growth rate.
On Peters's account, the entire edifice of expected utility theory was an elaborate patch to fix the wrong answers that came from using the wrong average.
Try It Yourself: The Growth Rate Calculator
Here's the tool that reveals the trick. Plug in any gain percentage and loss percentage for a repeated 50/50 gamble. The calculator shows you both the ensemble expected value (what economists traditionally compute) and the time-average growth rate (what actually happens to you).
[Interactive: Growth Rate Calculator. Set the gain and loss for a repeated 50/50 gamble; see when "positive expected value" is actually toxic.]
Play with the sliders. Notice how the gap between the two numbers widens as the loss percentage grows. A +100%/−60% gamble has a fantastic expected value of +20% per round, and a time-average growth rate of about −10.6%. The bigger the swings, the more the two averages diverge, and the more dangerous it is to trust the ensemble average.
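The calculator's core logic is a one-liner. Here is a sketch of the function it would need, checked against the +100% / −60% example above:

```python
import math

def growth_rates(gain, loss, p=0.5):
    """Per-round growth rates for a repeated gamble: gain +gain with
    probability p, lose -loss otherwise. Returns (ensemble, time_avg)."""
    ensemble = p * (1 + gain) + (1 - p) * (1 - loss) - 1
    time_avg = math.exp(p * math.log(1 + gain)
                        + (1 - p) * math.log(1 - loss)) - 1
    return ensemble, time_avg

e, t = growth_rates(1.00, 0.60)   # the +100% / -60% gamble from the text
print(f"ensemble: {e:+.1%}, time average: {t:+.1%}")   # +20.0%, -10.6%
```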
Why This Changes Everything
Inequality Isn't a Bug — It's a Theorem
If wealth dynamics are multiplicative — and they largely are, since investment returns, business growth, and asset appreciation all work as percentages — then inequality must increase over time. This isn't a policy failure. It's not caused by greedy CEOs or lazy regulators (though those don't help). It's a mathematical inevitability of multiplicative dynamics applied to a population.
The simulation above demonstrated this live. Even though every agent played the exact same game with the exact same odds, inequality exploded. No one cheated. No one had an unfair advantage. The Gini coefficient marched upward purely from the mathematics of multiplicative noise.
This has a profound political implication: without some mechanism of redistribution (taxation, social insurance, shared infrastructure), inequality doesn't just happen to grow — it's guaranteed to grow, as surely as entropy increases in a closed system.5
Risk Aversion Isn't Irrational
Traditional economics has a "risk aversion puzzle." People reject gambles that have positive expected value. They buy insurance even when the expected payout is negative. They diversify their portfolios when concentration would maximize expected returns. The standard explanation invokes the curvature of utility functions — people just feel that a dollar lost hurts more than a dollar gained helps.
Peters offers a simpler explanation: people aren't irrational. They're computing the right quantity — the time-average growth rate — and correctly recognizing that many "positive expected value" gambles are time-average losers. Insurance is a negative expected-value bet that increases your time-average growth rate by preventing catastrophic multiplicative losses. That's not irrational. That's arithmetic.6
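The insurance arithmetic is easy to check. In the sketch below, every number is hypothetical and chosen only to make the point visible: each year there is a 10% chance of a catastrophe that destroys 80% of your wealth, and insurance against it costs 9% of wealth per year, more than the 8% expected loss, so the policy is a "bad deal" by expected value:

```python
import math

p_disaster, loss, premium = 0.10, 0.80, 0.09   # hypothetical numbers

expected_loss = p_disaster * loss              # 0.08 < 0.09 premium:
                                               # negative expected value

# Time-average (log) growth rate per year, uninsured vs insured.
# Uninsured: 90% of years wealth is unchanged (log 1 = 0),
# 10% of years it is multiplied by 0.2.
g_uninsured = p_disaster * math.log(1 - loss)
# Insured: wealth is multiplied by 0.91 every year, no catastrophes.
g_insured = math.log(1 - premium)

print(f"uninsured: {math.exp(g_uninsured) - 1:+.1%} per year")  # -14.9%
print(f"insured:   {math.exp(g_insured) - 1:+.1%} per year")    #  -9.0%
```

The negative-expected-value policy raises the time-average growth rate by nearly six points per year. Both parties can rationally want the contract: the insurer averages over an ensemble of clients; the client averages over their own time.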
The Kelly criterion — the optimal bet-sizing formula used by professional gamblers and quantitative investors — is time-average growth rate maximization. John Kelly derived it at Bell Labs in 1956 by asking: "What fraction of my bankroll should I bet to maximize my long-run growth rate?" The answer: maximize the expected logarithm of returns. Kelly was solving exactly the problem Peters says everyone should be solving.7
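To make that concrete, here is a minimal sketch of the Kelly calculation for a simple repeated wager, with an assumed (hypothetical) 55% win probability at even odds:

```python
import math

def kelly_fraction(p, b):
    """Kelly's optimal bet fraction for a wager paying b-to-1
    with win probability p (the stake is lost otherwise)."""
    return p - (1 - p) / b

def log_growth(f, p, b):
    """Expected log growth per bet when staking fraction f of bankroll."""
    return p * math.log(1 + f * b) + (1 - p) * math.log(1 - f)

p, b = 0.55, 1.0                   # hypothetical edge: 55% at even odds
f_star = kelly_fraction(p, b)      # 0.10 -> bet 10% of bankroll
print(f"Kelly fraction:   {f_star:.0%}")
print(f"growth at Kelly:  {log_growth(f_star, p, b):+.4f} per bet")  # positive
print(f"growth near all-in: {log_growth(0.999, p, b):+.4f} per bet") # ruinous
```

Betting the Kelly fraction maximizes the time-average growth rate; betting more, even with a genuine edge, drags the growth rate down and eventually below zero. Expected value alone would tell you to bet everything, every time.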
GDP and the Ergodic Illusion
When politicians say "the economy grew by 3%," they're reporting an ensemble average. GDP per capita is the total output divided by the total number of people — it's a mean, not a median. In a non-ergodic economy, the mean can grow while the median stagnates or declines.
This is not hypothetical. In the United States, real GDP per capita roughly doubled between 1980 and 2020. But real median household income grew by only about 20% over the same period. The ensemble average grew beautifully. The typical person's experience was... different.
The ergodicity framework explains this divergence without invoking conspiracy or moral failure. If returns to capital are multiplicative and unevenly distributed, you expect the mean to pull away from the median. The economy can "grow" while most people tread water. Reporting the mean as though it describes the typical experience is the ergodic fallacy applied to an entire civilization.
Not Everyone Is Convinced
It would be dishonest to present Peters's framework as settled science. It isn't. The debate is live, contentious, and sometimes acrimonious.
Many economists argue that expected utility theory already handles this correctly. If you use a logarithmic utility function — which many models do — you get exactly the same prescriptions as Peters's time-average approach. The math is equivalent; the philosophy is different but the predictions are identical. Why rewrite the foundations when the existing theory already gives the right answers, once you choose the right utility function?8
Peters's response is that this equivalence is precisely the problem. Expected utility theory gets the right answer for the wrong reason, and this leads to errors in cases where the two frameworks diverge — particularly in policy design and in understanding why people make the choices they do. Saying "people have logarithmic utility" is a psychological claim that requires justification. Saying "people maximize their time-average growth rate" is a mathematical consequence of the dynamics they face.
Other critics point out that real economic dynamics aren't purely multiplicative. People earn wages (additive). They face both additive and multiplicative shocks. The ergodicity framework is elegant for the coin-flip model but may not extend cleanly to the messier reality of actual economic life.
These are fair criticisms. But even skeptics tend to agree on the underlying mathematics: for multiplicative processes, the ensemble average and time average diverge, and the time average is what matters for individual outcomes. The argument is about how much of real economics this applies to — not whether the math is correct.
The Moral of the Story
The ergodic hypothesis in economics is, at its core, a story about which average matters. When someone tells you the average return of the stock market is 10% per year, ask: whose average? When a politician says GDP grew by 3%, ask: did I grow by 3%? When a casino advertises a game with positive expected value, ask: positive expected value for whom — for each individual player over time, or for the house's balance sheet across all players at once?
The mathematician's instinct, as Ellenberg teaches us throughout How Not to Be Wrong, is to ask the question behind the question. And the question behind "what's the expected value?" is always: expected value of what, computed how, and relevant to whom?
Peters may or may not be right that economics needs to be rebuilt from the ground up. But he is certainly right about this: when the process isn't ergodic, the ensemble average lies to you. And in a world of multiplicative wealth dynamics, compound interest, and exponential growth, the processes that matter most are almost never ergodic.
The economy is growing. You might not be.