I am a daily practitioner of AI. I build with it, I rely on it, and I am optimistic about its potential to remove drudgery from our work.

But while I am an enthusiast for the technology, I am deeply skeptical of the narrative.
We are told that AI abolishes scarcity. Big Tech sells a vision where intelligence becomes “too cheap to meter”, lifting humanity into a post-scarcity utopia.
But if you audit the actual architecture of this economy, you realize that AI does not abolish scarcity. It reorganizes it.
The Anesthesia of Cheap Content
We are currently witnessing a massive divergence.
• The Output is Free: The marginal cost of generating text, images, and code has collapsed. The world is flooded with synthetic media that looks like “value.”
• The Input is Priceless: The infrastructure required to create that value (advanced fabrication, high-bandwidth memory, energy contracts, and specialized talent) is becoming aggressively scarce.
This is the Abundance Trap. The flood of cheap content acts as a political and economic anesthesia. It distracts us with the illusion of “magic” while the ownership class locks down the physics of the system.
We are confusing the faucet with the reservoir.
The New Choke Points
This structure is not a democracy; it is a high-tech caste system disguised as a utility.
In this new stack, power doesn’t come from having the best idea; it comes from having the reserved capacity.
Hyperscalers are currently locking in multi-year energy and compute contracts, effectively leaving the rest of the market (startups, researchers, and entire nations in the Global South) bidding for leftovers.
The class division is mechanical:
• The Landlords: Those who own the “physics” (Data Centers, Weights, Energy).
• The Tenants: Those who rent the API endpoints.
If you are just fine-tuning a model on someone else’s infrastructure, you don’t have leverage. You have a subscription.
The Atrophy of Human Agency
My deepest worry isn’t just about who captures the profit. It is about what this structure does to us as builders.
I talk to engineers every day who feel a strange mix of relief and emptiness. When you solve a complex problem with a single prompt, you feel the rush of velocity, but you lose the texture of the struggle.
This “Rentier Architecture” creates a psychological dependency.
When we build on top of closed, proprietary APIs, we are technically “shipping,” but we are losing the Right to Tinker.
We are becoming operators of a machine we are forbidden to understand.
• The Owner (Big Tech) learns from every interaction, deepening their understanding of the world.
• The Tenant (Us) gets the output, but our understanding of the underlying mechanics atrophies.
We are at risk of raising a generation of “Senior Prompt Engineers” who have never looked inside the black box. If we don’t demand access to the weights, not just to use them but to study them, we aren’t just losing economic leverage.
We are losing the intellectual dignity of our craft.
We must fight for open systems not just to keep prices low, but to keep our curiosity alive. We need to ensure that the future of intelligence is something we can take apart, fix, and learn from, not just something we subscribe to.
Beyond the “Social Media” Error
We have seen this movie before, but the stakes are higher now.
In the 2010s, we delegated the “Public Square” to private algorithms optimized for engagement. The result was a decade of polarization and a complete loss of control over our information diet.
But AI is more systemic. It doesn’t just intermediate communication; it intermediates decision-making. It allocates resources, writes code, and filters reality.
If we allow the governance model of Social Media (accountability to shareholders alone) to govern Artificial Intelligence, we are setting ourselves up for a catastrophic failure of public trust.
You cannot run the operating system of society on a “move fast and break things” license.
Governance at the Scale of Power
Power this immense cannot be left to self-regulate.
We need to stop treating AI like consumer software and start treating it like nuclear energy or climate infrastructure.
The risks sit at the intersection of security, economics, and fundamental rights.
I am not suggesting we nationalize the code, but we need binding constraints on the entities that build these high-capability systems.
We need oversight on the scale of the United Nations.
• Public Utility Frameworks: We must treat foundational compute access like an essential facility. It cannot be rationed solely by price.
• Independent Audits: We need agencies with the clearance to inspect the “black box”, checking for safety and bias before deployment, not apologizing for it after.
• Sovereign Resilience: Nations and civil society need “public options”, open infrastructure that ensures research and innovation aren’t locked behind a corporate paywall.
Conclusion
If we don’t intervene structurally, the default outcome is not a broadly shared future. It is a hierarchy built on a substrate of “free” intelligence, where the benefits accrue strictly to those who own the choke points.
We need to regulate access before the cement sets. True abundance isn’t just about having cheap tokens; it’s about having a seat at the table where the weights are set.
