Daniel Susser, Beate Roessler & Helen Nissenbaum, Online Manipulation: Hidden Influences in a Digital World, available at SSRN.

Congress has been scrambling to address the public’s widespread and growing unease about problems of privacy and power on information platforms, racing to act before the California Consumer Privacy Act becomes operative in 2020. Although the moment seems to demand practical and concrete solutions, legislators and advocates should pay close attention to a very timely and useful work by a trio of philosophers. In a working paper entitled Online Manipulation: Hidden Influences in a Digital World, Daniel Susser, Beate Roessler, and Helen Nissenbaum offer a rich and nuanced meditation on the nature of “manipulation” online. The article might provide the conceptual clarity required for the broad and sweeping kind of new law we need to fix much of what ails us. Although the article is theoretical, it could lead to some practical payoffs.

The article’s most important contribution is its deep dive into the meaning of manipulation, a harm separate and distinct from the harms more often featured in today’s technology policy discourse. Powerful players routinely deprive us of the opportunity for self-authorship over our own actions. Advertisers manipulate us into buying what we don’t need; platforms manipulate us into being “engaged” when we would rather be “enlightened” or “provoked” or “offline”; and political operatives manipulate us into voting against our interests. Taken together, these incursions into individual autonomy feed societal control, power imbalances, and political turmoil. The article builds on the work of many others, including Tal Zarsky, Ryan Calo (in an article that has received well-deserved praise from Zarsky in these pages), and Frank Pasquale, who have all written about the special problems of manipulation online.

The heart of the paper is an extended exploration of what it means to manipulate and how manipulation differs from other forms of influence, both neutral (persuasion) and malign (coercion). The philosophers focus on the hidden nature of manipulation. If I bribe you, threaten you, or present new evidence to influence your decision-making, you cannot characterize what I am doing as manipulation, according to their definition, because my moves are in plain sight. I might be able to force you to make the decision I desire, which might amount to problematic coercion, but I have not manipulated you.

This insistence on hidden action might not square with our linguistic intuitions. We indeed might feel manipulated by someone acting in plain sight, and the authors are not trying to argue against these intuitions. Instead, they claim that by limiting our definition of manipulation to hidden action, we can clear up conceptual murkiness on the periphery of how we define and discuss different forms of discreditable influence. This is very useful ground clearing, helping manipulation stand on its own as a category of influence we might try to attack through regulation or technological redesign.

The piece convincingly links increased fears of manipulation, thus defined, to the current and likely future state of technology, and to the power of information platforms in particular. The pervasive surveillance built into today’s information technology gives would-be manipulators access to a rich trove of information about each of us (Dan Solove’s “digital dossiers” and Danielle Citron’s “reservoirs of danger”), which they can buy and use to personalize their manipulations. Knowing the secret manipulation formula for each individual, they can then use the “dynamic, interactive, intrusive, and personalized choice architectures” of platforms to produce what Karen Yeung calls “hypernudging.” Online tools hide such behavior because they are designed to recede into the background; in one of the paper’s more evocative analogies, the authors argue that information technology operates more like eyeglasses than magnifying glasses, because we forget about eyeglasses while we are using them. “A determined manipulator could not dream up a better infrastructure through which to carry out his plans” than today’s technological ecosystem, they conclude.

Having crafted their own definition of manipulation, and having connected it to modern technology, the authors turn last to theories of harm. They focus on harm to autonomy, on the way manipulation undermines the ability of the manipulated “to act for reasons of their own.” We are treated like puppets by puppet masters pulling our strings; “we feel played.”

The cumulative effects of individual manipulations harm society writ large, posing “threats to collective self-government.” Consider the bolder claims about psychographic targeting made by the people at Cambridge Analytica before the last election, which, if true, suggest that “democracy itself is called into question” by online manipulation.

If Congress wants to enact a law prohibiting manipulative practices, this article offers some useful definitions: a manipulative practice is “a strategy that a reasonable person should expect to result in manipulation,” and manipulation is “the covert subversion of an individual’s decision making.” Congress would be wise to enact this kind of law, perhaps adding manipulation as a third prohibited act, alongside deception and unfairness, in Section 5 of the FTC Act.

In addition, Congress could breathe new life into notice-and-choice regimes. Currently, we are asked to take for granted that users “consent” to the extensive collection, use, and sharing of information about them because they clicked “I agree” on a terms-of-service pop-up window they once saw back in the mists of time. Were we to scrutinize the design of these pop-ups, assessing whether online services have used manipulative practices to coax users to “agree,” we might recognize the fiction of consent for what it really is. We should read implicitly, or build explicitly, into every privacy law’s consent defense a “no dark patterns” proviso, to borrow the phrase scholars like Woody Hartzog use for manipulative consent interfaces.

Finally, although these authors ground their work in the concept of autonomy, an unmeasurable concept not well-loved by economists, their argument could resonate even in the god-forsaken, economics-drenched tech policy landscape we are cursed to inhabit. Manipulation, as they have defined it, exacerbates information asymmetry and interferes with an individual’s capacity to act according to their preferences, resulting in market failure. A behavioral advertiser with a digital dossier “interferes with an agent’s decision-making process as they deliberate over what to buy. Worse yet, they may be enticed to buy even when such deliberation would weigh against buying anything at all.”

In fact, the authors go to some lengths to explore how harmful manipulation interacts with the concept of nudges. Some nudges should count as manipulation, when their designs and mechanisms are hidden, even if they bring about positive behavioral change. The architects of nudge theory might even embrace this conclusion. The article quotes liberally from Cass Sunstein, who has explored the ethics of government-imposed nudges and acknowledged their sometimes manipulative quality. It also resonates with recent ruminations by Richard Thaler, who has coined a new term, “sludges,” the negative mirror image of beneficial nudges. The fathers of nudges are finally cottoning on to what privacy scholars have been writing about for years: at least online, the negative sludges we encounter seem to outnumber the positive nudges, with the gap widening every day.

We have new targets in our sights, whether we call them manipulative practices, dark patterns, or sludges: the technological tools and tricks that powerful information players use to treat us like their puppets and cause us to act against our own self-interest. By lending precision to the meaning of manipulation, this article can help us meet many of the seemingly impossible challenges before us.

Cite as: Paul Ohm, Manipulation, Dark Patterns, and Evil Nudges, JOTWELL (May 22, 2019) (reviewing Daniel Susser, Beate Roessler & Helen Nissenbaum, Online Manipulation: Hidden Influences in a Digital World, available at SSRN), https://cyber.jotwell.com/manipulation-dark-patterns-and-evil-nudges/.