Julie Cohen’s Oligarchy, State, and Cryptopia is a bracing account of how a handful of technology elites can move beyond regulatory arbitrage to something more ambitious: remaking the rules by which they are governed. The article’s core claim is that some groups of tech elites do more than evade oversight; they reconfigure the administrative state to relocate meaningful rulemaking into private hands.
Cohen’s analysis clarifies a particular form of power and explains why familiar legal toolkits, such as antitrust, fail to address it. Of the many insights that emerge from Cohen’s comprehensive framework, three are worth noting. First, today’s tech elites fit the description of oligarchs not because they are rich, but because they can deploy their wealth through infrastructures that let them produce private rules (spanning both self-regulation and private governance) insulated from democratic accountability. Second, programs of AI governance should attend to political economy, because the firms that build and operate the infrastructure also shape the State that might regulate them. Third, extending that idea, privacy law’s traditional focus on individual consent misses the point: the risk privacy law should be addressing is the structural concentration of informational power.
From evasion to reconfiguration
Drawing on Jeffrey Winters, Cohen treats oligarchy as a politics in which extreme personal wealth is deployed to secure systemic advantage. Oligarchic power can coexist with any constitutional form and moves along a spectrum of modes depending on how it interacts with institutions. Oligarchy, State, and Cryptopia shows that many leading tech executives function as oligarchs through the infrastructure they own and the governance they hardwire into it.
Familiar explanations of noncompliance therefore understate the phenomenon. Cohen documents a pattern of defiance of public governance and law that runs from “move fast and break things” to orchestrated reg-neg campaigns aimed at reshaping the scope of oversight. Behaviors that seem unrelated, such as blitzscaling and participatory governance, cohere within a framework that shows how they combine to limit accountability. The same configuration helps explain the occasionally fraught relationship between big tech and States: the firms’ hybrid placement among modes of oligarchy shows why they are unusually resistant to traditional enforcement.
This configuration, it is worth noting, differs from a return to the Gilded Age. Nineteenth-century industrialists and financiers controlled the economy; the power of today’s tech elite is more multidimensional, because everyone else depends on the oligarchs’ infrastructure to speak, transact, and sometimes govern. The upshot is that tech executives embed governance structures, and occasionally bake dependencies, into privately provisioned infrastructure ranging from social media platforms to satellite systems, as well as into capital arrangements that bypass traditional forms of accountability.
AI governance needs political economy
Because, as Cohen shows, tech elites govern people, markets, and occasionally norms themselves through their infrastructure, treating AI regulation as a narrow technical problem is a mistake. Cohen’s analysis reveals that the project marketed as deregulation is not about decentralizing power or decision-making: leading actors lean on the State where it serves them, through contracts, subsidies, and favorable institutional redesign, and seek to reconfigure it where it does not, most visibly in political efforts pitched as efficiency-enhancing. Cohen explains that the absence of regulation is an invitation for private power to consolidate rule. AI governance, in that context, is largely a question of who controls the levers of how systems are deployed, who wins and who loses from the exercise of that control, and how that control interacts with political power.
The AI research ecosystem illustrates these points. Compute-intensive science relies on infrastructure and monetary resources that few firms can supply, which shapes what questions scientists can pursue. Talent flows to the private sector, publication is conditioned by trade secrecy, and debate is framed by private-sector priorities.
Cohen shows, in sum, why AI governance must engage the capital structures that entrench founder control and the infrastructure that governs privately, as well as the ideology that normalizes the decisions made through both. Technical checklists (model evaluations, watermarking, predetermined risk tiers) are popular with regulators, but they cannot substitute for confronting the move that places essential informational infrastructures in private hands. Longtermism supplies a moral endorsement for the behavior that Oligarchy, State, and Cryptopia explains: safeguard long-term aggregate utility by consolidating control today.
Privacy law misses the structural concentration of informational power
The article also makes visible what privacy law often misses: risks in the information economy come not simply from data extraction but from the structural concentration of informational power. Regulatory regimes centered on the individual cannot counter a system whose leverage point sits upstream; the law must operate at the same scale as the problem.
Privacy law has long tried to regulate the tech industry through individual consent and control. Cohen’s framework explains why this effort is misaligned with the problem. First, the relevant power is exercised at the collective level; it shapes economic and social conditions for populations that cannot reasonably leave its reach, a form of power poorly addressed by tools designed for discrete harms to individual privacy. Second, when informational infrastructures double as governance mechanisms (for example, by controlling access to resources or prioritizing speech), accountability must operate at that level too. Cohen’s diagnosis is trenchant: “there is no particular reason to think” that a toolkit built for ordinary corporate power can remedy tech oligarchy, given the latter’s combination of infrastructure control, wealth, and ideological certainty.
Why this article matters in practice
Cohen’s article, finally, is eminently useful. By understanding tech elites as oligarchs (hybrids who exercise personal control and selectively embrace institutionalism in ways that cement their authority), it offers a framework for thinking about a wide set of legal, social, and economic problems. This distinctive form of power relocates governance from public law into private infrastructure; AI intensifies the shift, and individualistic data rights are too small for the job. Oligarchy, State, and Cryptopia gives readers a vocabulary and a map: what oligarchy is, how those who control big tech depart from earlier elites, which institutional levers matter, and why debates that orbit “content moderation,” “individual control over data,” or “AI safety” alone will keep missing the center of gravity.