The Journal of Things We Like (Lots)
Amanda Levendowski, Resisting Face Surveillance with Copyright Law, 100 N. C. L. Rev. __ (forthcoming, 2022), available at SSRN.

One prevailing feature of technological development is that it is not sui generis. Rather, new technologies often mirror or reflect societal anxieties and prejudices. This is true for surveillance technologies, including those used for facial recognition. Although the practice of facial recognition might be positioned as a type of convincing evidence useful for identifying an individual, the fact remains that racial and gender biases can limit its efficacy. Scholars such as Timnit Gebru and Joy Buolamwini have shown through empirical evidence that facial recognition systems, which are often trained on limited data, display stunningly biased inaccuracy. The two AI researchers reviewed the performance of facial analysis algorithms across four “intersectional subgroups” of males or females featuring lighter or darker skin. They made the startling discoveries that the algorithms performed better when determining the gender of men as opposed to women, and that darker faces were most likely to be misidentified.

In her path-breaking article, Resisting Face Surveillance with Copyright Law, Professor Amanda Levendowski identifies these harms and others, and advocates for the proactive use of copyright infringement suits to curb the use of photographs as part of automated facial surveillance systems. First, Levendowski illustrates why the greater misidentification of darker faces by algorithmic systems is a problem of great concern. Levendowski shares the story of Robert Julian-Borchak Williams, who was placed under arrest in front of his home and in view of his family. A surveillance photograph had been used to algorithmically identify him. However, once the photograph was compared to Mr. Williams in person, it was obvious that he had been misidentified. The only explanation Mr. Williams got was, “The computer must have gotten it wrong.” The sad reality is that Williams’ case is not unique; there are many more stories of Black men being wrongfully arrested based on misidentification by AI systems. Given the glacial creep of federal legislation to regulate face surveillance, Levendowski advocates for turning to the copyright tools she believes we already have.

Facial recognition systems have proliferated in the past few years. For example, in 2020, an individual taking the Bar exam in New York related how he was directed to “sit directly in front of a lighting source such as a lamp” so the face recognition software could recognize him as present. I have written about and against the troubling use of facial recognition by automated hiring programs. Evan Selinger and Woodrow Hartzog have written about the extensive use of facial surveillance in immigration and law enforcement and have called for a total ban. Although some jurisdictions in the United States have heeded the call to ban the use of facial recognition systems by law enforcement, many others have not, and there is currently no federal legislation banning or even regulating the use of facial recognition systems.

Resisting Face Surveillance with Copyright Law is innovative in its approach of deploying copyright law as a sword against the use of automated facial recognition. As Levendowski argues, “Face Surveillance is animated by deep-rooted demographic and deployment biases that endanger marginalized communities and threaten the privacy of all.” Deploying copyright litigation to stem the use of facial recognition holds great potential for success because, as Levendowski notes, corporations like Clearview AI are trawling selfies and profile pictures online to compose a gargantuan face-recognition database for law enforcement and other purposes. Levendowski notes that Clearview AI has copied about three billion photographs without the knowledge or consent of the copyright holders or even the authorization of the social media companies that host those photographs. Levendowski’s article is one answer to what can be done with the laws we have now to curtail the use of face surveillance.

Levendowski notes that one common defense of scraping — to invoke the First Amendment — would not be viable against copyright claims. Levendowski recounts the Court’s statement in Eldred v. Ashcroft that “copyright law contains built-in First Amendment accommodations” which “strike a definitional balance between the First Amendment and copyright law by permitting free communication of facts while still protecting an author’s expression.” Thus, Levendowski concludes, copyright infringement lawsuits could serve as “a significant deterrent to face surveillance,” particularly given the hair-raising statutory damages of $150,000 for each case of willful infringement.

However, as Levendowski notes, there are several hurdles to the successful use of a copyright infringement lawsuit against face surveillance. For one, there is the affirmative defense of fair use. Levendowski concedes that the Google v. Oracle decision in 2021, which concluded that Google made a fair use when it copied interface definitions from Java for use in Android, has changed the fair use landscape and may make it less likely for copyright infringement suits against face surveillance systems to prevail. Yet, as Levendowski explains, the use of profile pictures may still fall outside of fair use protections because they are more likely to fail the four-factor test. She argues that unlike search engines, which fairly “use” works in order to point the public to them, facial recognition algorithms copy faces in order to identify faces. That is, the “heart” of the copied work — a person’s face — is the part that is copied by the face surveillance systems, and the use is less transformative than a search engine’s use. Levendowski also draws on recent case law to suggest that courts will be less likely to find the for-profit subscription model deployed by many facial recognition companies to be fair use, compared to the free-to-the-public model used by most search engines.

Levendowski deploys Google v. Oracle and other key fair use cases to assess each fair use factor. First, she notes that surveillance companies are not using the pictures for a new purpose; their reason for using the photographs is the same as that of profile pictures: particularized identification. Yet, Levendowski argues, even absent a new purpose, such use may still be somewhat transformative, favoring face surveillance companies. She then concludes that the nature of the work is creative and that the use features the photographs’ faces — the “heart” of profile pictures — creating unfavorable outcomes for these companies under the middle two factors. Analyzing the final factor, Levendowski concludes that using these photographs harms the unique licensing market for profile pictures, and that this dictates a ruling against fair use.

All in all, although some might not agree with her fair use analysis, I find Levendowski’s approach to be an ingenious example of lawyering in the digital age. If I have any reservation, it is that this analysis might introduce a new tactic for face surveillance corporations — to purchase or license the copyrights in the photographs they use. Such a tactic would be facilitated by social media or other platforms that require users to give up the copyrights to any photos they post. This indicates that there might yet be more regulation needed to address face surveillance. But in the meantime, Levendowski’s lawyering represents a creative approach to the problem of face surveillance.

Download PDF
Cite as: Ifeoma Ajunwa, Confronting Surveillance, JOTWELL (May 12, 2022) (reviewing Amanda Levendowski, Resisting Face Surveillance with Copyright Law, 100 N. C. L. Rev. __ (forthcoming, 2022), available at SSRN).