The Napkin That Launched a Trillion Dollars
In 1980, Robert Metcalfe had a problem. He'd invented Ethernet — the technology that would eventually connect every computer on Earth — but nobody wanted to buy it. His company, 3Com, was hemorrhaging money. So Metcalfe did what any self-respecting engineer-turned-salesman would do: he made a slide.
The slide was beautifully simple. On the left, a single telephone. Useless. Who are you going to call? On the right, a web of connections between many telephones. Invaluable. In between, a curve shooting upward: the value of the network grows as the square of the number of users.
One fax machine is a paperweight. Two fax machines are a communication system. A million fax machines are a revolution. The math seemed bulletproof: if you have N people on a network, each person can connect with N − 1 others, giving you N(N − 1)/2 possible connections. For large N, that's roughly N²/2. Double the users, quadruple the value.
Metcalfe's Law
V = N(N − 1) / 2 ≈ N² / 2 ∝ N²
Where N is the number of connected nodes
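The pair-counting argument is easy to verify directly. A quick sketch (in Python rather than on a napkin):

```python
def possible_connections(n):
    """Distinct pairs among n nodes: n choose 2 = n(n-1)/2."""
    return n * (n - 1) // 2

for n in (1, 2, 100, 200):
    print(n, possible_connections(n))
```

Going from 100 nodes to 200 takes you from 4,950 possible connections to 19,900: doubling the users almost exactly quadruples the count.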
It worked. Companies started buying Ethernet cards. Metcalfe got rich. And the idea burrowed deep into the collective brain of Silicon Valley, where it would incubate for two decades before hatching into something monstrous.
By the late 1990s, Metcalfe's Law had become the unofficial catechism of the dot-com boom. Every startup pitch deck featured some version of that upward-swooping curve. We don't need profits, founders argued, we need users. Get the users, and the value — the N-squared value — will take care of itself. Pets.com didn't need to sell dog food profitably; it just needed enough eyeballs for the network effects to kick in.1
The problem, of course, is that Metcalfe's Law is wrong. Not wrong like "the Earth is flat" wrong — it captures something real. But wrong in the way that matters most in mathematics: it's wrong about the exponent.
The Trouble with Counting Connections
Here's a thought experiment. You join Facebook. Your first friend is your spouse. Enormous value — now you can share photos, coordinate schedules, send messages. Your second friend is your best friend from college. Also great. Your fifth friend? Still solidly useful. Your fiftieth? Maybe it's that guy from the conference three years ago. You'll never message him, but sure, why not.
Your five-hundredth friend? You literally don't know who this person is.
Not all connections are created equal. Your 500th Facebook friend adds approximately zero value to your network experience. Metcalfe's Law says they're just as valuable as your first.
This was the devastating insight of Andrew Odlyzko and Benjamin Tilly, who published a critique in 2003 that should have been required reading for every venture capitalist in America.2 Their argument was elegant: yes, N nodes can form N²/2 connections. But "can" is doing an enormous amount of heavy lifting in that sentence. In practice, people don't maintain thousands of equally valuable relationships. They maintain a few close ones, some medium ones, and a long tail of near-meaningless ones.
If you rank your contacts by how much you actually interact with them, you get something that looks suspiciously like Zipf's Law: your k-th most contacted person gets roughly 1/k as much attention as your most contacted person.3 Your top contact might get 100 messages a month. Your 10th gets 10. Your 100th gets 1. Your 1000th gets... well, a LinkedIn birthday notification, if they're lucky.
Metcalfe assumes every connection is equally valuable (blue dashed). In reality, connection value follows a Zipf-like distribution (red curve) — a few connections matter enormously, most barely register.
When you sum up a Zipf distribution — 1 + 1/2 + 1/3 + ... + 1/N — you get approximately ln(N). So each user's value from the network is proportional to log(N), not N. Multiply by N users and the total network value is:
Odlyzko's Law
V = N · log(N)
Still grows with N, but much slower than N²
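You can watch the harmonic sum converge to the logarithm with a few lines of code, a small numerical sketch of the step above:

```python
import math

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n: total attention across n Zipf-ranked contacts."""
    return sum(1.0 / k for k in range(1, n + 1))

for n in (100, 10_000, 1_000_000):
    print(n, round(harmonic(n), 3), round(math.log(n), 3))
```

The gap between the two columns settles near 0.577 (the Euler–Mascheroni constant), so the ln(N) approximation only gets better as N grows.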
This is still superlinear — a network with 10 million users is worth more than ten times a network with 1 million users. Network effects are real. But N·log(N) is dramatically less than N². A network with a million users? Metcalfe says it's worth a trillion units of value. Odlyzko says about 20 million. That's a factor of 50,000.4
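Those headline numbers are easy to reproduce. A sketch, assuming a base-2 logarithm (which is what yields "about 20 million"; the natural log gives closer to 14 million, and the ratio barely moves):

```python
import math

N = 1_000_000
metcalfe = N ** 2           # a trillion "units of value"
odlyzko = N * math.log2(N)  # roughly 20 million
print(f"{metcalfe:.1e} vs {odlyzko:.1e}, ratio about {metcalfe / odlyzko:,.0f}")
```

The ratio comes out just over 50,000.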
If you're a venture capitalist who bet a hundred million dollars on Metcalfe's Law being literally true, that's... not a rounding error.
Grow a Network
But enough theory. Let's actually watch these laws in action. Below, you can build a network node by node and see how Metcalfe, Odlyzko, and Reed each reckon its value. Watch how the predictions diverge as the network grows — slowly at first, then spectacularly.
Network Value Calculator
Add or remove nodes and watch three different valuation models diverge.
Notice what happens around 10–15 nodes. Metcalfe and Odlyzko are still in the same ballpark, but Reed's Law has already gone to the moon. By 20 nodes, Reed's valuation is over a million — which is why David Reed himself argued that group-forming networks (like WhatsApp groups or subreddits) are far more valuable than simple communication networks.5
Of course, this is also why Reed's Law is the most dangerous of the three. Saying a 20-person network is worth a million "units" is the kind of claim that makes mathematicians reach for their emergency whiskey. The problem is combinatorial explosion: 2²⁰ is about a million, meaning there are a million possible subsets of 20 people. But nobody is joining a million groups. Most of those subsets — "me, my dentist, and three strangers from Paraguay" — have no reason to exist.
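The combinatorial explosion is easy to make concrete. Counting only subsets that could plausibly be a group (two or more members), the total is 2^n minus the empty set minus the n singletons:

```python
def reed_groups(n):
    """Possible groups among n people (2+ members): 2^n - n - 1."""
    return 2 ** n - n - 1

for n in (10, 20, 30):
    print(n, reed_groups(n))
```

By 20 people you're already past a million hypothetical groups; by 30, past a billion.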
Anthropologist Robin Dunbar famously proposed that humans can maintain about 150 stable relationships — a number dictated by the size of our neocortex. If Dunbar is right, then no matter how large the network grows, each user's "active network" caps out. Beyond 150 connections, Metcalfe's Law doesn't just overestimate — it enters pure fantasy.
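A toy model makes the Dunbar cap's consequences vivid. Assuming (as an illustration, not Dunbar's own math) that each user maintains at most 150 useful links, total network value stops growing quadratically and goes linear:

```python
def metcalfe_value(n):
    return n * (n - 1) // 2

def dunbar_capped_value(n, cap=150):
    """Each user keeps at most `cap` active links, so total value is
    n * min(n - 1, cap) / 2 instead of n * (n - 1) / 2."""
    return n * min(n - 1, cap) // 2

for n in (100, 1_000, 1_000_000):
    print(n, metcalfe_value(n), dunbar_capped_value(n))
```

Below the cap the two models agree exactly; at a million users they disagree by a factor of several thousand.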
Winner Takes All
But here's where network math gets politically interesting. Even if Metcalfe's Law overstates the absolute value of networks, it captures something crucial about relative value. If a network's worth grows faster than linearly with users — whether it's N², N·log(N), or something in between — then bigger networks are disproportionately more attractive than smaller ones.
This creates a vicious feedback loop. People join the bigger network because it's more valuable. Which makes it even bigger. Which makes it even more valuable. Which makes even more people join. Economists call this a "positive feedback" or "increasing returns" dynamic, and it leads inexorably to monopoly.6
The positive feedback loop that turns slight advantages into total market domination.
This is why the tech industry tends toward monopoly in network-dependent markets. It's not (just) because tech billionaires are ruthless. It's because the math demands it. Two competing networks of equal quality are an unstable equilibrium, like a ball balanced on a hilltop. The slightest breeze — a celebrity joining one, a server outage on the other — tips the balance, and positive feedback does the rest.
Try it yourself:
Tipping Point Simulator
Two competing networks. Set the initial market share and watch winner-take-all dynamics unfold.
Set the slider to 50/50 and hit run. Then try 51/49. Then 55/45. You'll notice that even a tiny initial advantage — a few percent — is enough to trigger total domination. The tipping point is razor-thin. This is why early-stage startups are so desperate to grow fast: not because growth is inherently good, but because in a network-effects market, being slightly behind is a death sentence.
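The simulator's dynamics can be caricatured in a few lines. A deterministic sketch, assuming each newcomer simply joins whichever network currently offers the higher N² value (real adoption is noisy, but the instability is the same):

```python
def simulate(a0, b0, newcomers=10_000):
    """Each newcomer joins the network with the higher Metcalfe
    value (n^2); an exact tie sends one user to each side."""
    a, b = a0, b0
    for _ in range(newcomers):
        if a * a > b * b:
            a += 1
        elif b * b > a * a:
            b += 1
        else:
            a += 1  # dead heat: the cohort splits evenly
            b += 1
    return a / (a + b)

print(simulate(50, 50))  # perfectly balanced: share stays at exactly 0.5
print(simulate(51, 49))  # a 51/49 start: the leader ends with ~99.5%
```

The 50/50 case is the ball balanced on the hilltop: it holds only because the model allows a perfect tie. Any perturbation at all, and every subsequent user piles onto the leader.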
The Mathematical Tragedy of MySpace
But the feedback loop can also spin in reverse. MySpace peaked at about 100 million users in 2008 — then collapsed. What happened? As the network grew, so did spam, bots, and garish profile pages that made the platform miserable to use. The quality of connections degraded faster than the quantity grew. Facebook offered a cleaner experience; a trickle of defectors became a flood, and the same positive-feedback math that built MySpace destroyed it. Friendster suffered an even more dramatic version of the same story — its servers literally couldn't handle the load, creating a "negative network effect" where more users made the experience worse.7
What the Data Actually Says
So which law is right? In 2015, researchers at Tencent tested the models against actual data from the company's networks. They found that network value scaled approximately as N^1.5 — right between Odlyzko's N·log(N) and Metcalfe's N². The truth, as is often the case in mathematics, was somewhere in the middle.8
Facebook's revenue data tells a similar story. Between 2010 and 2020, as monthly active users grew from about 500 million to 2.8 billion (roughly 5.6×), revenue grew from $2 billion to $86 billion (roughly 43×). Pure Metcalfe would predict 5.6² ≈ 31×. The actual growth exceeded even that — but mostly because Facebook got better at monetization, not because the raw network effects were N².
Facebook's actual revenue growth (gold) versus what Metcalfe and Odlyzko would predict from user growth alone. Revenue outpaced even N² — suggesting monetization improvements, not pure network value.
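The back-of-envelope arithmetic, using the approximate figures quoted above:

```python
users_2010, users_2020 = 0.5e9, 2.8e9  # monthly active users (approximate)
rev_2010, rev_2020 = 2e9, 86e9         # annual revenue in USD (approximate)

user_growth = users_2020 / users_2010   # 5.6x
metcalfe_prediction = user_growth ** 2  # about 31x
actual = rev_2020 / rev_2010            # 43x
print(user_growth, metcalfe_prediction, actual)
```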
The honest answer is that network value doesn't follow any single clean power law. It depends on the type of network (communication? marketplace? social?), the quality of connections, and a dozen other factors that resist tidy formulas. Metcalfe's Law is less a law of nature and more a law of salesmanship — a useful heuristic that became dangerous when people forgot it was an approximation.
The Lesson
The story of Metcalfe's Law is really a story about the seductive power of simple mathematical models. N² is elegant. It fits on a napkin. It makes your graph swoop in the direction that excites investors. And it captures a genuine truth: networks do get more valuable as they grow.
But the gap between "more valuable" and "N-squared valuable" is the gap between a sensible investment and a catastrophic bubble. The dot-com crash of 2000 destroyed about $5 trillion in market value, much of it invested in companies whose entire business case rested on Metcalfe's Law applied to eyeball counts. In 2022, the crypto crash did something eerily similar, with network effects invoked to justify the "value" of tokens that nobody was actually using for anything.
The lesson of Metcalfe's Law is not that network effects are fake. It's that the exponent matters. In mathematics, the difference between N² and N·log(N) is the difference between a good investment and a Ponzi scheme.
Robert Metcalfe, to his credit, never claimed his "law" was anything more than an approximation for selling Ethernet cards. It was the rest of us who turned a sales pitch into a mathematical religion. The next time someone shows you a graph with an N² curve and asks you for money, remember: the connections you actually use are a tiny, Zipf-distributed fraction of the connections that are theoretically possible. And in that gap between possibility and reality lies the graveyard of a hundred billion-dollar startups.