The Journal of Things We Like (Lots)
Jasmin Brieske & Alexander Peukert, Coming into Force, Not Coming into Effect? The Impact of the German Implementation of Art. 17 CDSM Directive on Selected Online Platforms, CREATe Working Paper, available at SSRN (Jan. 25, 2022).

The European Union has been busy updating its regulation of online services in a variety of ways. This includes a directive that requires Member States to institute a new online copyright regime. Services that host user-generated content will be required to keep unlicensed works off their sites, and also required to negotiate with copyright owner groups for licensing agreements. In essence, other hosting sites will have to behave like YouTube in its deals with major music and film labels. This new regime was imposed by what’s known as Art. 17 of the 2019 Directive on Copyright in the Digital Single Market (CDSM Directive). (The Digital Services Act further complicates the picture because it overlaps with the laws required by Art. 17 and adds to their requirements, but I will focus here on Art. 17.)

Unlike its content-agnostic counterpart the Digital Services Act, the copyright-specific Art. 17 does not itself have the force of law; it requires transposition into national law, and different countries have taken different approaches to that transposition. Germany’s transposition has been one of the most ambitious and user-oriented. Brieske & Peukert’s working paper Coming into Force, Not Coming into Effect? The Impact of the German Implementation of Art. 17 CDSM Directive on Selected Online Platforms explores how the new German regime affected—and didn’t affect—the copyright-related policies and practices of major sites. As it turns out, neither the user protections nor the rightsowner protections seem to have changed the practices of the big sites—giving more evidence that the major impact will be on smaller sites that may not even have had the problems that purportedly justified this new licensing-first regime. The piece is an important reminder that implementation is everything: New legislation is exciting and produces lots of work for lawyers, but that doesn’t mean it produces wider change.

As Brieske and Peukert explain, few EU member states actually met the deadline for transposition, due in part to the inconsistencies in Art. 17 itself: Supposedly, the directive didn’t require the use of automated filtering—but it imposed duties to prevent unauthorized uploads that could not practically be accomplished without filters, to be applied when rightsholders supplied information sufficient to identify their works. Art. 17 was also supposed to preserve some user rights, but current technologies don’t (and likely never will) identify non-copyright-infringing uses of works (“fair dealing” in Germany) such as reviews, quotations, and parodies in an automated way.

Germany’s implementation aimed to thread the needle by limiting automated blocking and creating a category of uses that are presumptively authorized by law and should not be blocked. A presumptively authorized use:

(1) contains less than half of one or several other works or entire images,

(2) combines this third-party content with other content, and

(3) uses the works of third parties only to a minor extent or, in the alternative, is flagged by the user as legally authorized. Minor uses are really minor, however: “uses that do not serve commercial purposes or only serve to generate insignificant income and concern up to 15 seconds of a cinematographic work or moving picture, up to 15 seconds of an audio track, up to 160 characters of a text, and up to 125 kilobytes of a photographic work, photograph or graphic.” Unlike fair use or even traditional fair dealing, this is a rule rather than a standard.
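Because these thresholds are bright-line rules rather than standards, the "minor use" determination can in principle be checked mechanically. A minimal sketch of that check, with hypothetical names and a deliberately simplified treatment of "insignificant income" (collapsed into a single non-commercial flag), might look like this:

```python
# Hypothetical sketch of the German implementation's quantitative
# "minor use" thresholds as quoted above. Names, structure, and the
# simplified commerciality test are illustrative assumptions, not
# drawn from the statute or the paper.

MINOR_USE_LIMITS = {
    "film": 15,    # up to 15 seconds of a cinematographic work or moving picture
    "audio": 15,   # up to 15 seconds of an audio track
    "text": 160,   # up to 160 characters of a text
    "image": 125,  # up to 125 kilobytes of a photograph or graphic
}

def is_minor_use(work_type: str, amount: float, commercial: bool) -> bool:
    """Return True if the excerpted amount is within the statutory cap
    for its media type and the use is non-commercial (the statute's
    'insignificant income' nuance is folded into the boolean here)."""
    limit = MINOR_USE_LIMITS.get(work_type)
    if limit is None or commercial:
        return False
    return amount <= limit
```

The point of the sketch is the contrast the authors draw: unlike fair use, nothing here requires judgment about purpose, transformativeness, or market effect; a filter can apply these caps automatically, which is precisely why the regime was drafted this way.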

Moreover, providers have a duty to notify rightsowners when their identified works are used in minor or flagged ways, and offer an opportunity for the rightsowners to object, either via a takedown notice or, in cases involving “premium” content like live sports or current movies, via a “red button” that will immediately block access to the upload.

As is evident, this is a complicated system, perhaps rescued by the idea that mostly it won’t be used, since rightsowners have no real incentives to protest truly minor or critical uses. The German implementation also requires services to inform users about the existence of exceptions and (like the DSA) provide a dispute resolution procedure. Article 17 contemplates only an internal dispute resolution process, while the DSA will require the largest sites to provide for external arbitration as well.

Did all this complexity result in changes in the copyright policies of major sites? The authors studied “YouTube, Rumble (a smaller platform with similar functionality), TikTok, Twitter, Facebook, Instagram, SoundCloud and Pinterest.” The sites appeared not to change much or at all in response to the new German law, even when they had Germany-specific versions (as most did), although their policies also varied a fair amount across the entire group. Most notably, all the sites, with the exception of Twitter, were already using automated upload filters before they were required to do so. This result reflects what followers of research on the US DMCA have long known: Big platforms that experienced lots of unauthorized uploads had already transitioned away from reliance on notice and takedown and legal safe harbors, and towards using filtering, and often licensing, in “DMCA Plus” systems. Art. 17 thus didn’t change matters much if at all for those platforms, while potentially imposing expensive new duties on platforms that don’t have significant infringement problems.

The theoretical protections for users don’t seem to have done much. Likewise, the sites all already had internal dispute mechanisms, further indicating that market pressures were already producing some “due process” protections for users even without legal requirements. Larger sites may also soon be pushed further by legal requirements: under the overlapping obligations of the DSA, very large sites will be required to provide outside arbitrators for appeals. Meanwhile, the sites didn’t seem to implement or tell users about the possibility of flagging an upload as authorized by law, and they also didn’t warn copyright owners of the possible penalties for repeated abuse of the system. The inefficacy of user protections may be a harbinger of the fate of other attempts to inject users’ rights into systems predicated on broad copyright controls.

As the authors point out, the difficulties in passing implementing legislation across the EU made a “wait and see” approach reasonable for many platforms. The European orientation towards accepting good-faith attempts at compliance, unlike the usually more-legalistic American approach, may also play a role. With Content ID or similar filtering mechanisms and internal appeal options already in place, letting the details vary somewhat from the formal requirements of the law might readily seem low-risk. There was no widespread noncompliance with the protections for large copyright owners, who are the most likely to sue and the most expensive to defend against. Users whose fair dealing is blocked are more likely to complain online or give up, neither of which is nearly as damaging.

The authors’ results are consistent with a story of regulation lagging behind reality, and also of regulation being designed with only the big players in mind. Websites like Ravelry (focused on the fiber arts) don’t really need filters to prevent them from being hotbeds of copyright infringement; keeping the site on-topic suffices for that even though it allows lots of user-generated content. And it turns out that the DMCA-Plus sites that most people use most of the time already did filter and didn’t bother to change how they filtered just because they were supposed to respect user rights in the process. The results also might support the alternative harder-law approach of the DSA, which doesn’t require transposition into national law. There’s no reason to wait and see what national implementations will look like, and a more limited risk of differing national interpretations (though this could still happen). Moreover, the DSA at least attempts to focus on the largest and thus most “dangerous” platforms, though I have argued elsewhere that its targeting is still relatively poor.

Brieske and Peukert help explain why online content governance is so difficult: Not only are regulators dealing with conflicting and sometimes irreconcilable priorities (pay copyright owners, avoid overblocking) but their solutions have to be translated into working systems. Services aware that they can’t automate fair dealing are easily tempted into sticking with the policies and systems they already put into place. Since the objective of licensing everything except what need not be licensed can’t be achieved on an automated, large-scale basis, there is little incentive to improve. That is not a happy lesson, but it is one worth heeding.

Cite as: Rebecca Tushnet, Best Laid Plans: The Challenges of Implementing Article 17, JOTWELL (October 23, 2023) (reviewing Jasmin Brieske & Alexander Peukert, Coming into Force, Not Coming into Effect? The Impact of the German Implementation of Art. 17 CDSM Directive on Selected Online Platforms, CREATe Working Paper, available at SSRN (Jan. 25, 2022)), https://cyber.jotwell.com/best-laid-plans-the-challenges-of-implementing-article-17/.