From Mass to Leverage
Vol. 4, No. 7
Scale used to come from size. Now it comes from leverage, and technology decides who has it.
There’s a well-known rule at Amazon that teams should be small enough to be fed with two pizzas. The idea is simple: once coordination grows beyond a certain point, it starts to slow everything down.
That intuition is now playing out across entire systems. The internet has begun to reshape what size signals, making headcount a less reliable marker of success. When Facebook acquired Instagram in 2012, Instagram had 13 employees and was already serving millions of users. When WhatsApp was acquired for $19 billion, it had around 55 employees.
Scale continues to matter in infrastructure, where it underpins how systems run. But above that layer, its role is shifting. What used to require large organisations can now emerge from much smaller units, operating with a level of leverage that was previously out of reach.
Even so, the reflex to treat scaling as a success metric has imprinted itself on how we think, almost as a default. Adding more people is often seen as the path to stronger outcomes. The current wave of AI points in a different direction, expanding what individuals can do with agents and small constellations of them.
Over time, this reflex has shaped organisations into bureaucracies, where slowing things down becomes a byproduct of the system itself. Structures tend to reward coordination, which gradually outweighs progress. At some point, keeping things moving starts to look like a disruption.
The assumption that scale will save you is dangerous.
As coordination work moves out of process, it concentrates in people. Decisions matter more, defaults start to matter, and the question of what to build and why becomes harder to delegate. The centre of gravity shifts toward direction, requiring a different kind of attention to what is emerging.
From here, scale becomes something you build into the system, not something you carry in the organisation. And the constraint returns to something more human: how much complexity you can hold and how much direction you can set. Sometimes, the limit is still as simple as how many people you can feed with two pizzas.
Why we need to rethink scale
Production efficiency no longer requires accumulated volume when a single developer with AI tools can write code, design interfaces, and manage infrastructure that would have required twenty people five years ago. Market power through distribution is being disrupted by AI-driven content, community-based growth, and platforms that allow small producers to reach global audiences without large sales organizations. Talent aggregation is becoming less critical when AI agents can perform an expanding range of specialized tasks. And organizational resilience, while still a genuine advantage of large organizations, matters less when smaller organizations can operate with lower fixed costs and therefore survive disruptions that would force layoffs and restructuring at larger rivals.
Rita McGrath | 7 Minutes
The Permission Pipeline
The future will be shaped by defaults. What runs automatically. What requires a tap. What requires a signature. What triggers a pause. Most organizations may treat these as “process.” That’s a mistake. Defaults are policy. Gates are power. Whoever configures the gates is making the real decisions about how fast you’re allowed to move. When the busywork of coordination disappears, the real coordination (the kind nobody had a meeting template for) doesn’t get cheaper. It becomes much more expensive AND unavoidable. Judgment under uncertainty. Taste. Responsibility. Trust. Narrative. These aren’t soft skills. They’re what’s left when the permission pipeline stops being a full-time job for half the company.
Greg Ceccarelli | 4 Minutes
The Human Alignment Problem
If the bottleneck is no longer intelligence but desire — if the limiting factor on the most powerful technology ever created is the quality of what we bring to it — then we are increasingly moving out of a technical conversation and into a spiritual one. I want to make the case that spirituality has always been, at its core, about the clarification of desire. And as the bottleneck of these emerging systems shifts to desire, our interaction with AI becomes, whether we recognize it or not, an encounter with our own depths and spiritual nature. The question AI is asking each of us, with enormous power behind it, is the same question the contemplative traditions have been asking for millennia: What do you actually want?
Daniel Thorson | 14 Minutes
Stop Designing the End State. Start Sensing What’s Next.
First, technology is lowering the barriers to entry to participate, which started with the internet. Today, anyone can jump in and become part of an economic unit. Second, how does that change organisations? How do organisations need to position themselves within that? Because the economy is fragmenting into miniatures, into smaller units of business owners. And third, how do we make sense of that? Who gets to participate? How do we empower them? How do organisations understand what is happening in the ecosystem around them?
In that sense, as a futuring architect, I’m in an orientation and sense-making role, trying to understand what is emerging and how we can unlock the potential hidden in technology with the organisation, the people, and the teams around me.
Ron & Adriana in Whatever Next Unplugged | 25 Minutes