
This is an excellent post that lucidly examines what it means for power to be decentralised in the world of AI. The author, Alex Chalmers, argues that decentralisation is not automatically good; it works only when embedded in a framework that can coordinate local actors, define boundaries, and step in when things go wrong.

Chalmers draws on historical thinkers and different traditions, ultimately arguing that if we care about pluralism and autonomy, we should design bounded decentralisation with explicit constitutional guardrails. In other words, we shouldn’t just assume “more nodes” or “more voices” automatically means more freedom.

In today’s world, a handful of companies control the compute, data, and frontier models that are restructuring how billions of people interact with the world. Existing institutions are struggling to keep up. The concentration of power in AI labs is now one of the defining political questions of the decade.

Many are unhappy about this development, with groups like the AI Now Institute, the Distributed AI Research Institute (DAIR), and the Algorithmic Justice League arguing that AI development as currently constituted is irredeemably centralizing. They believe that we need to relocate authority away from corporations and regulators towards the communities most affected by these systems. When policymakers look for alternatives to the status quo of corporate self-governance and light-touch regulation, these groups are frequently in the room.

Ideas around participatory AI governance draw on a deep intellectual tradition linking technology and power, dating back to nineteenth-century anarchism and running through twentieth-century American social theory. While elements of the diagnosis have force, both the analysis and the prescriptions suffer from fatal flaws that become even more acute in the AI age.

[…]

If your starting premise is that human flourishing is what happens when the megamachine gets out of the way, you don’t need to weigh the goods it produces, because they aren’t really goods. You don’t need a theory of when expertise is legitimate, because expertise is a symptom of the problem. You don’t need mechanisms against capture, because capture is what happens under the current system and will dissolve along with it.

The intellectual apparatus is structured to avoid the questions that a functional governance regime has to answer. What looks like a program for radical democracy turns out to be a refusal of the conditions under which democratic decisions about complex systems can be made at all.

[…]

[G]overnance must go where the knowledge is. This could be professional bodies, academic institutions, or open-source communities. They would each govern usage within the domain where their members have the requisite competence and stakes. Fortunately, most of these institutions already exist. They do not need to be designed from first principles or assembled by the participation industry.

While the vast majority of governance questions are deployment problems where domain-specific institutions have the advantage, a handful of bigger challenges sit above this layer. Problems like the security of frontier model weights or thresholds for certain dangerous capabilities require a degree of state or interstate coordination.

[…]

No quantity of nested enterprises can resolve the production-side concentration of frontier AI. A handful of labs control the most powerful models, and no amount of deployment-side checks and balances can change that.

But a thick ecosystem of intermediary institutions on the deployment side creates countervailing power. The labs must satisfy many masters rather than capturing one regulator, or, as the anarchist model would have it, being replaced by a constellation of community-run alternatives that will never match their capabilities.

[…]

Freedom has never depended on power being small. It has depended on power being answerable to more than one authority at a time, and on citizens belonging to institutions that can push back on their own terms. The task ahead of us is building that intermediary layer.

Source: Cosmos Institute

Image: Deborah Lupton