KEEPING CONSUMERS IN THE DARK: ADDRESSING “NAGGING” CONCERNS AND INJURY

In the digital context, companies often use “dishonest design”—commonly known as “dark patterns”—to trick or push consumers into doing things they wouldn’t necessarily have done otherwise. Existing scholarship has focused on developing a taxonomy and definitions for different categories of dark patterns, conducting empirical research to better understand the effectiveness of dark patterns, and broadly surveying the legal and regulatory landscape for theories, existing and new, through which to curb these practices. This Note offers a deep dive into one category of dark patterns—“nagging”—and the unique legal issues that the practice raises. While the FTC has started to use its section 5 “unfair or deceptive” authority to combat some other types of dark patterns, particularly practices that mislead consumers, nagging practices are especially elusive—but just as insidious as the more commonly discussed categories of dark patterns. This Note identifies the direct and indirect harms that nagging poses to consumers, argues for the regulation of the nagging category of dark patterns, and proposes a “do not nag” feature, modeled after the federal “do not call” list, as a solution.

Introduction

In 2015, LinkedIn settled for $13 million with users who, after signing up for LinkedIn’s “Add Connections” feature, were dismayed to learn that LinkedIn had sent unwanted emails to their address book contacts on their behalf. 1 Ahiza Garcia, LinkedIn to Pay $13 Million for Unwanted Emails, Lawyers Could Get $3.3 Million, CNN Money (Oct. 3, 2015), https://money.cnn.com/2015/10/03/news/linkedin-settles-lawsuit-emails/index.html [https://perma.cc/8JNB-BPZC]. These LinkedIn users had agreed to send an initial email inviting their professional contacts to connect, but what they didn’t know was that LinkedIn would send up to two reminder emails to each contact. 2 Id. Contacts on the receiving end of these reminder emails had virtually no way to opt out of reminders. 3 John Brownlee, After Lawsuit Settlement, LinkedIn’s Dishonest Design Is Now a $13 Million Problem, Fast Co. (Oct. 5, 2015), https://www.fastcompany.com/3051906/after-lawsuit-settlement-linkedins-dishonest-design-is-now-a-13-million-problem [https://perma.cc/ZY5T-8XVL]. This is one of the more notorious cases of companies using “dishonest design”—also known as “dark patterns”—to trick or push consumers into “doing things they don’t really want to do.” 4 Id. But even though most dark patterns don’t make headlines or result in multimillion dollar settlements, they significantly impact consumers’ online experiences because they are everywhere. 5 Eric Ravenscraft, How to Spot—and Avoid—Dark Patterns on the Web, WIRED (July 29, 2020), https://www.wired.com/story/how-to-spot-avoid-dark-patterns/ [https://perma.cc/T8R2-B5D3].

Many of the basic tactics and strategies underlying dark patterns are neither new nor unique to the online context. Before the advent of the internet, salespeople and marketing professionals had long wielded persuasion, coercion, and even manipulation with great effect. 6 See Jamie Luguri & Lior Jacob Strahilevitz, Shining a Light on Dark Patterns, 13 J. Legal Analysis 43, 45–46 (2021) (listing door-to-door sales and transactions involving funeral services, telemarketing, and home equity loans as examples of these high-pressure, sometimes questionable sales tactics); see also Fed. Trade Comm’n v. Age of Learning, Inc., No. 2:20-cv-7996, at 1 (C.D. Cal. Sept. 1, 2020) (statement of Comm’r Rohit Chopra, Regarding Dark Patterns in the Matter of Age of Learning, Inc.), https://www.ftc.gov/system/files/documents/public_statements/1579927/172_3086_abcmouse_-_rchopra_statement.pdf [https://perma.cc/V9X5-CWFA] [hereinafter Age of Learning, Statement of FTC Commissioner] (recognizing that dark patterns are “the online successor to decades of dirty dealing in direct mail marketing”). What makes these practices particularly concerning in the digital context, however, is their scale: Online platforms can reach millions of consumers within seconds through targeted advertisements, and companies can use automated tools to spam consumers with marketing emails. 7 Ryan Calo, Digital Market Manipulation, 82 Geo. Wash. L. Rev. 995, 1021 (2014); Justin (Gus) Hurwitz, Designing a Pattern, Darkly, 22 N.C. J.L. & Tech. 57, 67–68 (2020) (suggesting that what is unique about dark patterns is that, in the online context, “[t]here is practically no limit to design choices, and those design choices can be changed, tweaked, updated, and targeted with ease”).

Companies’ incentives are not always aligned with consumers’ best interests or preferences, and design is a potent tool for companies 8 In the digital context, through A/B testing, companies now have the ability to conduct experiments on consumers to learn how changes in user interface or product design can affect consumers’ behavior. See Brian Christian, The A/B Test: Inside the Technology That’s Changing the Rules of Business, WIRED (Apr. 25, 2012), https://www.wired.com/2012/04/ff-abtesting/ [https://perma.cc/J2Y9-KJXL] (“A/B [testing] allows seemingly subjective questions of design—color, layout, image selection, text—to become incontrovertible matters of data-driven social science.”); Justin Elliott & Paul Kiel, The TurboTax Trap: Inside TurboTax’s 20-Year Fight to Stop Americans From Filing Their Taxes for Free, ProPublica (Oct. 17, 2019), https://www.propublica.org/article/inside-turbotax-20-year-fight-to-stop-americans-from-filing-their-taxes-for-free [https://perma.cc/R9YQ-EFVP] (describing how the company “conducts rigorous user testing” to make design choices that “maximize how many customers pay, regardless if they are eligible for the free product,” and “[d]ark patterns are something that are spoken of with pride and encouraged” in design meetings). to shape consumers’ digital experiences and influence their behavior. 9 Woodrow Hartzog, Privacy’s Blueprint: The Battle to Control the Design of New Technologies 27 (2018) (“Through signals, design helps define our relationships and our risk calculus when dealing with others. Design affects our expectations about how things work and the context within which we are acting.”). For example, companies that mediate consumers’ online social interactions have “overwhelming incentives to design technologies in a way that maximizes the collection, use, and disclosure of personal information.” 10 Id. at 5.
As Professor Woodrow Hartzog and others have noted, “The predominant Internet business model is built on collecting as much user data as possible and selling it or using it to target and persuade users . . . . Design can be leveraged in subtle ways to get more, more, more.” Id. Some scholars have argued that design should play a bigger role in privacy law, which has tended to focus more on data collection, use, and distribution. 11 Id. at 12 (“Most students of privacy policy understand privacy by design to mean a proactive ex ante approach to considering and protecting privacy . . . . The opposite of privacy by design is responding to a privacy harm after it has occurred.”). The more enforcers overlook design, the more room companies have to use design to run around privacy and other consumer protection laws. See id. at 57 (“Design tricks like manipulative and confusing website design or evasive surveillance devices can be technically legal yet leave people ignorant, deceived, confused, and hurt.”); see also Ari Ezra Waldman, Privacy, Notice, and Design, 21 Stan. Tech. L. Rev. 74, 107–08 (2018) (explaining that “users consider design when making privacy choices,” not just the substance of privacy policies). After all, design conveys signals to consumers, affects the transaction costs of their online activities, 12 See Lauren E. Willis, Why Not Privacy by Default?, 29 Berkeley Tech. L.J. 61, 110–11 (2014) (describing how firms’ design and choice-architecture decisions alter or frame the consumer’s “decision environment” by affecting transaction barriers). and affects their perceptions. 13 Hartzog, supra note 9, at 42. As Professor Woodrow Hartzog remarks, “Design is everything . . . . [D]esign is power.” 14 Id. at 21, 23; see also Calo, supra note 7, at 1004 (noting that a consequence of consumer mediation is that “firms can and do design every aspect of the interaction with the consumer” (emphasis added)); Waldman, supra note 11, at 78–79 (“[D]esign configures users, limiting our freedom in ways predetermined by the designer . . . . [W]ebsite design can discourage us from reading privacy notices . . . or coerce us into mismanaging our privacy contrary to our true intentions.”).

Scholarship on dark patterns has focused on developing a taxonomy and definitions for different types of dark patterns, conducting empirical research to better understand the effectiveness of dark patterns, and broadly surveying the legal and regulatory landscape for theories, existing and new, through which to curb these practices—categories of dark patterns ranging from the merely troubling to the clearly manipulative. 15 See Luguri & Strahilevitz, supra note 6, at 45 (explaining that existing research focuses on the taxonomy and “growing prevalence of dark pattern techniques”). But see Hurwitz, supra note 7, at 104–05 (arguing that we should first consider existing statutory authority before “overlying new . . . layers to the regulatory fabric,” and that the fact that many firms use design for questionable purposes alone “does not demand legislative or regulatory innovation in response . . . [because] the market is an effective check on these practices”). Scholars and researchers have already identified and recognized “nagging”—online design practices that create persistent interactions with users and may eventually compel them to do things that they wouldn’t necessarily have done—as one of many categories of dark patterns. This Note contributes to existing legal scholarship by offering a deep dive into the nagging category of dark patterns, particularly the unique legal issues that the practice raises.

This Note argues for the regulation of the nagging category of dark patterns and proposes a “do not nag” feature, modeled after the federal “do not call” list, as a solution. While the FTC has started to use its section 5 “unfair or deceptive” authority to combat some types of dark patterns, particularly practices that mislead consumers, nagging practices are especially elusive—but just as insidious as the more commonly discussed dark patterns. Part I of this Note defines the nagging category of dark patterns and argues that nagging practices are harmful to consumers and warrant timely intervention. In particular, section I.B identifies both the direct and indirect harms that nagging poses to consumers. Part II provides an overview of recent legislative and regulatory responses to dark patterns more generally and explains why existing consumer protection legal frameworks, though likely capable of addressing most other categories of dark patterns, will be ineffective at addressing nagging. Section III.A proposes a “do not nag” feature as a solution to the unique nagging problem, drawing on lessons learned from the “do not call” registry and the (ultimately unsuccessful) “do not track” movement. Section III.B further explores how a “do not nag” feature would survive First Amendment scrutiny and engages with other critiques this solution may face, including that it places too heavy a burden on consumers and could have unintended consequences.