The Bezos Cocktail Party
There's a cocktail party thought experiment that economists love, and it goes like this. You're in a room with nine other people. Everyone's got roughly the same net worth — say, $80,000 each. Not wealthy, not poor. The kind of people who worry about car repairs but can usually cover them. Teachers, nurses, office managers, that sort of thing. You've all had a glass or two of wine. The conversation is pleasant. Someone makes a joke about student loans.
Then Jeff Bezos walks in.
He doesn't make a grand entrance. He's just there, suddenly, holding a drink, wearing his usual vest. Maybe he nods at the shrimp cocktail. The average net worth of the room just shot past $10 billion.
Congratulations! You are now, on average, fabulously wealthy. You could, on average, buy a yacht. On average, you never need to work again. You are, mathematically speaking, richer than the GDP of several small nations combined. The average person in that room has more money than they could spend in a hundred lifetimes.
But check your bank account. Nothing's changed. You still have $80,000. You still have to go to work on Monday. The average has become a mathematical fiction that describes precisely nobody in the room — not the nine ordinary people, and not even Bezos himself, who has $150 billion, not $10 billion.
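The arithmetic is easy to check. A minimal sketch in Python, assuming the figures from the text (nine guests at $80,000, Bezos at $150 billion):

```python
# Mean vs. median when Bezos joins the party: nine guests at $80,000,
# one guest at $150 billion.
from statistics import mean, median

room = [80_000] * 9 + [150_000_000_000]

print(f"mean:   ${mean(room):>14,.0f}")   # ~$15 billion: the ensemble average
print(f"median: ${median(room):>14,.0f}")  # $80,000: the typical guest
```

The mean jumps past $10 billion the moment he walks in; the median, the number that describes the typical guest, doesn't move at all.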
This isn't just a cute observation about income inequality, though it is certainly that. It's the entry point to one of the most profound — and most overlooked — ideas in the mathematics of decision-making. An idea that sat in plain sight for three centuries while economists built empires on its absence. An idea that explains why the advice you get from financial advisors might be perfectly wrong for you personally. Why insurance companies make money hand over fist while selling you "peace of mind." Why the house always wins, even when the odds look favorable.
The idea is called ergodicity. And the man who dragged it into the light — a physicist named Ole Peters — thinks it might be the biggest blind spot in all of economics.1
The word itself comes from statistical mechanics, where physicists used it to describe systems where, over long enough time scales, a single particle samples all possible states. But here's the kicker: most of economics assumes that people are like those particles. That your life, over time, will look like the average across many people right now. That the ensemble average — what happens to the group — is the same as the time average — what happens to you through time.
Peters looked at this assumption and said: prove it. And when he did the math, he discovered something shocking. The assumption fails catastrophically for most of the decisions that actually matter in human life.
The Coin Flip That Eats Your Money
Let me offer you a bet. A fair coin is flipped. Heads: your total wealth increases by 50%. Tails: your total wealth decreases by 40%.
Take a moment. Is this a good bet?
If you've taken an economics course, or read a personal finance book, or listened to a crypto bro on Twitter, you probably reached for expected value. You did the calculation in your head: 50% chance of +50%, 50% chance of -40%. The expected value calculation says: absolutely. On each flip, your expected wealth multiplier is $0.5 \times 1.5 + 0.5 \times 0.6 = 1.05$, a 5% expected gain per flip.
So let's play. You start with $100 — not enough to change your life, but enough to notice. You flip the coin. Heads! You now have $150. You feel smart. You flip again. Tails. You lose 40% of $150, dropping to $90. Hmm. You flip again. Heads: $135. Tails: $81. Another heads: $121.50. Another tails: $72.90.
Something is wrong. The math said you should be winning. But you feel like you're being slowly bled dry.
So let's scale this up. Let's run a simulation with 100 players, each starting with $100, each flipping 100 times. Here's what happens to the average wealth across all players: it goes up. Dramatically. Exponentially. After 100 flips, the average player has over $13,000. The expected value calculation was right!
But here's what happens to you, a single player playing through time: you almost certainly go broke. In fact, in our simulation of 100 players, roughly 85 of them will end up with less than they started. Many will be essentially ruined — their wealth rounded down to pocket change through the relentless arithmetic of repeated multiplicative shocks.
The Ergodicity Coin Flip
Each round: Heads → wealth × 1.5, Tails → wealth × 0.6. Watch the gap between ensemble average and individual experience.
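Here's a minimal simulation of that experiment, with the payoffs from the text (×1.5 on heads, ×0.6 on tails); the player count is scaled up to 10,000 so the gap is stable run to run:

```python
# Simulate the multiplicative coin flip: heads -> wealth x 1.5, tails -> x 0.6.
# Compare the ensemble average (mean over many players) with the typical
# player's outcome (the median) after 100 flips each.
import random
from statistics import mean, median

random.seed(0)

def play(start=100.0, flips=100):
    wealth = start
    for _ in range(flips):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    return wealth

outcomes = [play() for _ in range(10_000)]

print(f"ensemble mean:   ${mean(outcomes):>12,.2f}")  # pulled up by rare lucky streaks
print(f"median player:   ${median(outcomes):>12,.2f}")  # pennies on the dollar
print(f"share below $100: {sum(w < 100 for w in outcomes) / len(outcomes):.0%}")
```

The mean looks healthy because a handful of players hit long heads streaks; the median tells you what your own trajectory almost certainly looks like.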
How is this possible? How can a game with positive expected value destroy almost everyone who plays it?
The answer is multiplication. A 50% gain followed by a 40% loss doesn't leave you up 10%. It leaves you at $1.5 \times 0.6 = 0.9$, just 90% of where you started.
Each "round trip" of one heads and one tails — which will happen, on average, every two flips — shrinks your wealth by 10%. Not because the coin is biased, but because the math of percentages is cruel and asymmetric. You can compound gains forever and lose everything to a single bad streak. Or more commonly, you can tread water through good luck and bad, while the arithmetic of multiplication slowly grinds you down.
The geometric growth rate — what actually happens to you over time — is $\sqrt{1.5 \times 0.6} = \sqrt{0.9} \approx 0.949$, a loss of roughly 5% per flip.
The gap between the ensemble average (what happens to the group) and the time average (what happens to you) is the essence of non-ergodicity. And it's everywhere once you know to look for it.
Ergodic vs. Non-Ergodic
A system is ergodic when the time average equals the ensemble average. When the experience of one person over a long time is the same as the experience of many people at one moment. When you can learn everything you need to know about the future by looking at a snapshot of the present.
Think of a fair coin that pays $1 for heads and takes $1 for tails. This is ergodic. If 1,000 people flip once, about 500 win and 500 lose. The average is zero. If one person flips 1,000 times, they'll wander up and down but end up close to zero. The time average converges to the ensemble average. The math works.
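That convergence can be verified directly. A sketch, assuming 100,000 flips for both views:

```python
# Additive fair coin: +$1 on heads, -$1 on tails. Here the time average
# and the ensemble average agree -- the process is ergodic.
import random

random.seed(1)

def flip():
    return 1 if random.random() < 0.5 else -1

# Ensemble view: 100,000 people flip once each.
ensemble_avg = sum(flip() for _ in range(100_000)) / 100_000

# Time view: one person flips 100,000 times.
time_avg = sum(flip() for _ in range(100_000)) / 100_000

print(f"ensemble average per flip: {ensemble_avg:+.4f}")  # near 0
print(f"time average per flip:     {time_avg:+.4f}")      # also near 0
```

Both numbers hover near zero. For the additive coin, the snapshot of the crowd really does tell you your own future.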
But the multiplicative coin flip — the one that multiplies your wealth by 1.5 or 0.6 — is non-ergodic. Your personal trajectory diverges wildly from what the population average would predict. The ensemble average goes up because a tiny fraction of players get incredibly lucky, hitting long streaks of heads that compound into astronomical wealth. These outliers pull the average up even as the median — the typical player — plummets toward zero.
This is why the median matters more than the mean in non-ergodic systems. The median is what actually happens to a typical person. The mean is what happens to a fictional average that nobody experiences.
Ole Peters, working at the London Mathematical Laboratory starting around 2011, argued that this distinction isn't just a technical curiosity. It's the crack in the foundation of expected utility theory.3
In 1738, Daniel Bernoulli tackled the St. Petersburg paradox — a game with infinite expected value that no reasonable person would pay infinite money to play. His fix: people maximize expected utility of wealth, not expected wealth. And utility is logarithmic — the first dollar matters more than the millionth.
Peters argues Bernoulli got the right answer for the wrong reason. People avoid the St. Petersburg game not because of diminishing utility in some psychological sense, but because they're computing the time average — what would happen to them over repeated play. The logarithmic utility function happens to be the mathematical operation that converts ensemble averages into time averages for multiplicative processes. It's not psychology. It's dynamics.4
The Kelly Criterion: Betting Like Your Life Depends On It
There's a fascinating parallel to all of this in the world of gambling, discovered by a Bell Labs scientist named John Kelly in 1956. Kelly was working on information theory with Claude Shannon — yes, that Claude Shannon, the father of information theory — when he stumbled upon a question that seems unrelated to ergodicity but turns out to be the same mathematical insight wearing different clothes.
The question: if you have a betting edge — say, you know a coin is biased 60-40 in your favor — how much of your bankroll should you bet on each flip?
The naive answer: bet everything! You have an edge! The expected value is positive! But anyone who's actually tried this knows what happens: the first time you lose, you're wiped out. You had a 40% chance of ruin on the very first flip, and probability always catches up.
The timid answer: bet almost nothing. Play forever, never lose much, never win much. Preserve capital at all costs. But this leaves money on the table — or rather, leaves exponential growth on the table.
Kelly found the optimal answer: bet exactly the fraction of your bankroll that maximizes the geometric growth rate. For a coin that wins with probability $p$ and pays even money, the Kelly fraction is $2p - 1$. For our 60-40 coin, that's 20% of your bankroll per bet.
Notice what Kelly is doing. He's not maximizing expected value. He's maximizing the time-average growth rate — exactly the quantity that matters in non-ergodic systems. The Kelly criterion is the practical implementation of ergodicity economics.
Full Kelly betting maximizes the logarithm of wealth over time. The logarithm appears again — the same mathematical operation that converts ensemble averages to time averages. Kelly didn't know about ergodicity economics (the term wouldn't be popularized for another half-century), but he discovered the same truth: when outcomes multiply, you need to think in logarithms, not linear expectations.
Ed Thorp, the mathematician who beat blackjack and later ran the world's first quantitative hedge fund, used the Kelly criterion to turn a theoretical edge into hundreds of millions of dollars. Warren Buffett, though he doesn't frame it this way, invests in a Kelly-like manner — concentrated bets on his best ideas, never risking total ruin, always thinking about the compounding of wealth over time rather than the expected value of any single investment.
The Kelly criterion has a tamer sibling, though. Half-Kelly betting — betting half the optimal amount — reduces volatility significantly while only giving up 25% of the growth rate. Many professional gamblers and investors use fractional Kelly strategies because they recognize that the "optimal" strategy assumes you can handle the ride, and most human psychology can't. The math says bet 20%; your amygdala says please, for the love of God, bet 10%.9
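The full-versus-half-Kelly trade-off follows from the standard growth-rate formula for even-money bets, $g(f) = p\ln(1+f) + (1-p)\ln(1-f)$, where $f$ is the fraction of bankroll wagered. A quick check with the 60-40 coin from the text:

```python
# Time-average growth rate per bet when wagering fraction f of bankroll
# on an even-money coin that wins with probability p:
#   g(f) = p*ln(1 + f) + (1 - p)*ln(1 - f)
import math

def growth_rate(f, p=0.6):
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

p = 0.6
kelly = 2 * p - 1  # Kelly fraction for even money: 0.20

g_full = growth_rate(kelly)        # ~0.0201 per bet
g_half = growth_rate(kelly / 2)    # ~0.0150 per bet, ~75% of full Kelly
g_all_in = growth_rate(0.999)      # betting nearly everything: growth collapses

print(f"full Kelly (f=0.20): {g_full:.4f}")
print(f"half Kelly (f=0.10): {g_half:.4f} ({g_half / g_full:.0%} of full)")
print(f"f=0.999 (all-in):    {g_all_in:.2f}")
```

Half the bet keeps three-quarters of the growth, and betting nearly everything drives the time-average growth rate deeply negative even though the edge never changed.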
The GDP Illusion
So far this might feel like an abstract mathematical point — ensemble averages versus time averages, geometric means versus arithmetic means, who cares? But here's where it gets political, and personal, and impossible to ignore.
GDP per capita — the total economic output divided by population — is an ensemble average. When economists announce that "the economy grew by 3%," that doesn't mean your income grew by 3%. The growth might be entirely concentrated among people whose wealth multiplies through investment returns, stock options, and capital gains while your wages stagnate through additive annual raises that barely keep pace with inflation.
The ensemble grew. The time average of a typical person did not.
This is not a conspiracy. It's not even a policy failure, exactly. It's a mathematical feature of non-ergodic economic systems. When wealth dynamics are multiplicative — when the rich get richer not by addition but by multiplication — the ensemble average can rise forever while the median stagnates or falls. The average person becomes poorer even as the average wealth goes up.
The Economy Grew. Did You?
GDP per capita vs. real median household income, United States, 1984–2023, indexed to 100. Both start at the same point — watch them diverge.
This is why GDP per capita can rise for a decade while median household income flatlines. It's why a politician can truthfully say "the economy is growing" while most people feel like they're treading water. It's why the "average American" has a higher net worth than ever while the typical American feels poorer than their parents.5
The ergodicity framework doesn't just describe this phenomenon — it predicts it. In a multiplicative wealth system, you expect the mean and median to diverge. You expect the average to become increasingly fictional, describing a world that exists for nobody while the typical experience stagnates. The math doesn't just allow this; it requires it.
Russian Roulette for the Soul
There's a darker flavor of non-ergodicity that doesn't just shrink your wealth — it eliminates you entirely. The mathematician Nassim Taleb calls this "absorbing barriers" or "ruin." The physicist Ole Peters calls it the difference between "growth" and "survival."
Imagine a game of Russian roulette. A revolver with one bullet and five empty chambers. Someone offers you a million dollars to pull the trigger once.
Run the expected value calculation. Five-sixths of the time, you get a million dollars. One-sixth of the time, you die. If we assign a value to human life — say, ten million dollars — the calculation goes: (5/6 × $1M) + (1/6 × −$10M) ≈ −$833,000. Negative. So let's sweeten the deal — ten million dollars to play. Now: (5/6 × $10M) − (1/6 × $10M) ≈ +$6.67M. Positive expected value!
But here's the thing about Russian roulette: you can only play it once. Or rather, you can only play it until you lose, which — if you keep playing — is guaranteed to happen eventually. The time average of Russian roulette is death with certainty. The ensemble average says "most people who play this game get rich." The time average says "everyone who keeps playing eventually dies."
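The decay is stark in numbers. A few lines of arithmetic, assuming a fair six-chamber revolver with one bullet:

```python
# Survival probability after n trigger pulls of a six-chamber revolver
# with one bullet: (5/6)**n. One pull looks safe; repetition is fatal.
for n in [1, 10, 50, 100]:
    p_survive = (5 / 6) ** n
    print(f"after {n:>3} pulls: survival probability {p_survive:.2e}")
```

One pull is 83% safe. Fifty pulls leave you with about a one-in-ten-thousand chance of still being alive.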
This is the ultimate non-ergodic process. Death is an absorbing barrier. Once you hit it, you don't get to average out your results over future trials. You're done. The ensemble can continue — new players can take your place — but you are gone.
Real life is full of absorbing barriers. Bankruptcy is one — once you're bankrupt, you can't play anymore, so the fact that "most people eventually recover" is irrelevant to you. Prison is another. Reputational destruction is another. Climate tipping points might be another. In each case, the ensemble average tells a story of resilience and recovery, while the time average for any individual participant is a story of permanent exclusion.
Insurance companies understand this instinctively. They don't maximize expected value; they maximize survival probability across their portfolio. They diversify so that no single catastrophe can ruin them. They maintain reserves so they can survive runs of bad luck. They are, in effect, engineering an ergodic system out of non-ergodic individual risks — pooling time averages so that the ensemble behavior becomes predictive.
When you buy insurance, you're doing the opposite. You're accepting a negative expected value (the premium is more than the expected payout) in exchange for eliminating ruin risk. You're trading ensemble thinking for time-average thinking. And according to ergodicity economics, this isn't irrational — it's optimal.10
What's Your Bet Actually Worth?
Whenever someone offers you a gamble and describes it in terms of "expected value" or "average return," you should now know to ask: is this additive or multiplicative? Is there an absorbing barrier? Am I playing once or repeatedly? Do I care about the ensemble or my personal trajectory through time?
If outcomes add to your wealth (fixed dollar amounts), expected value is a fine guide. If outcomes multiply your wealth (percentages), expected value can lie to your face. The relevant quantity is the time-average growth rate.
If you can only play once, or if there's a chance of ruin, expected value may be meaningless. The relevant quantity is survival probability.
The Ergodicity Calculator
Enter a multiplicative gamble and see whether the ensemble and time averages agree or diverge.
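The core of such a calculator fits in a few lines; the function name and parameters (`u`, `d`, `p`) are my own labels, not from the text:

```python
# For a gamble that multiplies wealth by u with probability p and by d
# otherwise, compare the ensemble-average growth factor (arithmetic mean)
# with the time-average growth factor (geometric mean).
def ergodicity_check(u, d, p=0.5):
    ensemble = p * u + (1 - p) * d    # what the group averages per round
    time_avg = u ** p * d ** (1 - p)  # what one player compounds per round
    return ensemble, time_avg

ensemble, time_avg = ergodicity_check(u=1.5, d=0.6)
print(f"ensemble factor per round: {ensemble:.3f}")  # 1.050 -> looks like +5%
print(f"time-avg factor per round: {time_avg:.3f}")  # 0.949 -> actually -5%
```

When the two factors straddle 1.0, as they do for the coin flip from the opening section, the gamble enriches the ensemble while ruining the individual.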
Bernoulli's logarithmic utility function — the one that says the first dollar matters more than the millionth — happens to produce the same decisions as maximizing the time-average growth rate. The log of wealth is the utility function that makes the ensemble calculation give you the same answer as the time-average calculation. Bernoulli got the right answer for the wrong reason, or perhaps for a reason he couldn't fully articulate.7
But here's the deeper insight: the logarithm isn't just a psychological fudge factor to make the math work. It's the natural coordinate system for multiplicative processes. Just as you use logarithms to turn multiplication into addition — ln(a × b) = ln(a) + ln(b) — you use logarithmic utility to turn multiplicative wealth dynamics into additive utility dynamics. The log converts the non-ergodic into the ergodic.
You Already Knew This
You already knew this. Not in the formal language of ensemble averages and geometric growth rates, not in the vocabulary of ergodicity and absorbing barriers, but in your bones. You knew it every time someone told you "the average American has…" and you thought, that's not my life. You knew it when you decided not to bet your entire savings on that "sure thing" your cousin told you about. You knew it when you bought insurance even though "on average" it's a losing bet.
The mathematics of ergodicity gives a name to that feeling. It says: your instinct is correct. The average doesn't describe your life, because you don't live in the average. You live in a specific, singular, sequential path through time. You can't jump to another branch of the multiverse when your timeline goes wrong. You can't average across parallel versions of yourself. You get one ride, and the math of that ride is different from the math of the group.
This insight — once seen — can't be unseen. It changes how you think about investment advice, economic policy, and personal decisions. It explains why the rich get richer in ways that compound over generations. Why inequality isn't just a moral problem but a mathematical inevitability in non-ergodic systems. Why diversification isn't just a conservative strategy but a mathematical requirement for survival.
Insurance makes sense — not because you're "irrationally risk-averse," but because a catastrophic loss in a multiplicative world can't be undone by future gains. You're buying ergodicity.
Diversification makes sense — not as a hedge for nervous people, but as a mathematical strategy to make your personal process more ergodic. Don't put all your eggs in one basket because you only get one timeline.
Progressive taxation makes sense — not as charity, but as a collective mechanism to prevent non-ergodic concentration of wealth from eating the system alive. When wealth multiplies, you need countervailing additive forces.
Warning signs become clearer: anyone promising returns based on "average" performance in multiplicative systems is either lying or mathematically illiterate. The average is a useful fiction. Your life is real.8
There's a humility in this framework that's missing from traditional economics. Expected utility theory assumes you can know the probabilities, calculate the outcomes, and optimize accordingly. Ergodicity economics admits that you can't know which branch of the multiverse you'll end up in, so you optimize for the trajectory rather than the snapshot.
You don't live in the average. You live in time. And in time, the math is different.