The Journal of Things We Like (Lots)
evelyn douek, The Rise of Content Cartels, Knight First Amendment Inst. at Columbia Univ. (2020).

Content moderation is a high-stakes, high-volume game of tradeoffs. Platforms face difficult choices about how aggressively to enforce their policies. Too light a touch and they provide a home for pornographers, terrorists, harassers, infringers, and insurrectionists. Too heavy a hand and they stifle political discussion and give innocent users the boot. Little wonder that platforms have sometimes been eager to take any help they can get, even from their competitors.

evelyn douek’s The Rise of Content Cartels is a careful and thoughtful exploration of a difficult tradeoff in content-moderation policy: centralized versus distributed moderation. The major platforms have been quietly collaborating on a variety of moderation initiatives to develop consistent policies, coordinated responses, and shared databases of prohibited content. Sometimes they connect through nonprofit facilitators and clearinghouses, but increasingly they work directly with each other. douek’s essay offers an accessible description of the trend and an even-handed evaluation of both its promise and its perils.

Take the problem of online distribution of child sexual abuse materials (CSAM). There is a broad consensus behind the laws criminalizing the distribution of CSAM images, such images have no redeeming societal value, and image-hashing technology is quite good at flagging only uploads that are close matches for ones in a reference database. Under these circumstances, it would be wasteful for each service to maintain its own database of CSAM hashes. Instead, the National Center for Missing and Exploited Children (NCMEC) maintains a shared database, which is widely used by content platforms to check uploads.
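To make the mechanism concrete, here is a minimal, illustrative sketch of how a platform might screen uploads against a shared hash database. It is not the NCMEC or PhotoDNA implementation: the database contents and function names are hypothetical, and the exact-match cryptographic hash is a simplification, since production systems rely on perceptual hashes that also catch near-duplicates (resized or re-encoded copies).

```python
# A minimal sketch, assuming a hypothetical shared database of image hashes
# distributed by a clearinghouse to participating platforms. Real deployments
# use perceptual hashes (e.g., PhotoDNA) that tolerate resizing and
# re-encoding; the exact-match cryptographic hash here is only a stand-in.
import hashlib

# Hypothetical shared database of hex digests of known prohibited images.
SHARED_HASH_DATABASE = {
    hashlib.sha256(b"known-prohibited-image-bytes").hexdigest(),
}

def hash_upload(image_bytes: bytes) -> str:
    """Compute the digest used to look up an upload in the shared database."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block(image_bytes: bytes) -> bool:
    """Flag an upload whose digest matches an entry in the shared database."""
    return hash_upload(image_bytes) in SHARED_HASH_DATABASE

if __name__ == "__main__":
    print(should_block(b"known-prohibited-image-bytes"))  # True: exact match
    print(should_block(b"some-other-image-bytes"))        # False: no match
```

The significance of the shared set is that every participating platform checks uploads against the same reference entries, which is what makes the resulting moderation decision effectively collective rather than platform-by-platform.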

douek traces the spread of the NCMEC model, however, to other types of content. The next domino to fall was “terrorist” speech: not always so clearly illegal and not always so obviously low-value. The Global Internet Forum to Counter Terrorism helps the platforms keep beheading videos from being uploaded. There have been similar initiatives around election interference, foreign influence campaigns, and more. I would add that technology companies have long collaborated with each other on security and anti-spam responses (often with law enforcement in the room as well) in ways that effectively amount to a joint decision on what content can and cannot transit their systems.

When there are so few platforms, however, content collaboration can become content cartelization. The benefits of collaboration on content moderation are many. Where there is an existing consensus on which content is acceptable, policy enforcement is more effective because platforms can pool their work. Even where there is not, platforms can learn from each other by sharing best practices. Some coordinated malicious activity is hard to detect when each platform holds only one piece of the puzzle; botnet takedowns now involve industry partners in dozens of countries. And to be effective, bans on truly bad actors need to be enforced everywhere, or the banned actors will simply migrate to the most permissive platform.

But douek smartly explains why content cartels are also so unsettling. They make it even harder to assess responsibility for any given moderation decision, both by obscuring who actually made it and by slathering the whole thing in a “false patina of legitimacy.” They amplify the existing “power of the powerful” by removing one of the classic safety valves for private platform speech restrictions: alternative avenues for the speaker’s messages. And, much like economic cartels, they present decisions made in smoky back rooms as though they were the “natural” outcomes of “market” forces.

Particularly sharp is douek’s explanation of how coordinated content moderation contrasts with the rhetoric of competition these companies normally adopt. Even the name itself, content cartels, highlights how this coordinated behavior raises questions of antitrust law and policy. To this list might be added the danger that content-moderation creep will turn into surveillance creep, as platforms decide that to make decisions about their own users’ posts, they need access to information about those users’ activities across the Internet.

The Rise of Content Cartels resists the temptation to cram platform content moderation into a strictly “private” or strictly “public” box. Like douek’s forthcoming Governing Online Speech: From ‘Posts-As-Trumps’ to Proportionality and Probability, it is thoughtful about the relationship between power and legitimacy, and broad-minded about developing new hybrid models to account for the distinctive character of our new speech and governance institutions.

It is an exciting time for content-moderation scholarship. Articles from just five years ago read as dated and janky compared with the outstanding descriptive and normative work now being published. douek joins scholars like Chinmayi Arun, Hannah Bloch-Wehba, Joan Donovan, Casey Fiesler, Daphne Keller, Kate Klonick, Renee DiResta, Sarah T. Roberts, and Jillian C. York in doing important work in this urgently important field. To borrow a phrase, make sure to like and subscribe.

Cite as: James Grimmelmann, Content Cartels and Their Discontents, JOTWELL (April 13, 2021) (reviewing evelyn douek, The Rise of Content Cartels, Knight First Amendment Inst. at Columbia Univ. (2020)), https://cyber.jotwell.com/content-cartels-and-their-discontents/.