THE OPTIMAL OPT-IN OPTION: PROTECTING VULNERABLE CONSUMERS IN THE EXPANDING PRIVACY LANDSCAPE

This Note addresses the ever-growing body of privacy laws being enacted throughout the United States and the danger that the “opt-out” data collection model poses to many populations. There is a disparity in the level of “digital literacy” throughout the United States, and as more consumer data privacy laws emerge and replicate the existing legislation, that disparity deepens.

Patterns in who does and who does not opt out of data collection contribute to algorithmic bias. Access to consumer data can create discriminatory and unequal treatment, which may be exacerbated by disparities in participation in opt-out provisions, increasing the vulnerability of populations less aware of, or less educated about, the potential dangers of data collection. It is crucial that the United States implement a more robust regulatory system regarding its opt-out provisions to protect those who are most vulnerable in the digital world.


Introduction

In May 2016, ProPublica found that risk scores used nationwide to predict whether a defendant will commit a crime in the future are biased against Black people. 1 Julia Angwin, Jeff Larson, Surya Mattu & Lauren Kirchner, Machine Bias, ProPublica (May 23, 2016), https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing [https://perma.cc/7QDL-SGA4]. In a study of 7,000 defendants, their risk scores, and their actual recidivism rates, “[w]hite defendants were mislabeled as low risk more often than [B]lack defendants.” 2 Id. For-profit companies like Equivant survey and collect data on defendants and then generate these biased risk scores. 3 Id. Equivant previously conducted business under the name “Northpointe” until a 2017 rebrand. Press Release, Equivant, CourtView Justice Solutions Inc., Constellation Justice Systems Inc., and Northpointe, Inc. Announce Company Rebrand to Equivant (Jan. 10, 2017), https://www.einnews.com/pr_news/361378637/courtview-justice-solutions-inc-constellation-justice-systems-inc-and-northpointe-inc-announce-company-rebrand-to-equivant [https://perma.cc/6TD4-QKXT]. Like most developers, these companies aim to create a more efficient, productive system that prevents the introduction of human errors. “The trick, of course, is to make sure the computer gets it right.” 4 Angwin et al., supra note 1.

In 2019, Ziad Obermeyer—a professor at the University of California, Berkeley’s School of Public Health—and his team were looking into how algorithms inform healthcare management in large hospitals. 5 Heidi Ledford, Millions Affected by Racial Bias in Health-Care Algorithm, 574 Nature 608, 608 (2019). Their research revealed not only a problem in the healthcare industry but also the tip of a systemically racist iceberg. An algorithm widely used by hospitals in the United States to help allocate healthcare to patients was guilty of “systematically discriminating against [B]lack people.” 6 Id. The data from the hospital Obermeyer and his team studied showed that the people who self-identified as Black “were generally assigned lower risk scores than equally sick white people.” 7 Id. Black patients, though equally sick, were wrongly considered to be in less urgent or immediate need of care than white patients. 8 See id. (“As a result [of discriminatorily assigned lower risk scores] . . . [B]lack people were less likely to be referred to the programmes that provide more-personalized care.”).

The information and data that are collected on people can be twisted and used in discriminatory ways. Now more than ever, “the amount and variety of data that is collected from individuals has increased exponentially, ranging from structured numeric data to unstructured text documents such as email, video, audio and financial transactions.” 9 Maddalena Favaretto, Eva De Clercq & Bernice Simone Elger, Big Data and Discrimination: Perils, Promises and Solutions. A Systematic Review, J. Big Data, Dec. 5, 2019, at 1, 2. Companies use, sell, and share this information. 10 See Sarah Lamdan, Defund the Police, and Defund Big Data Policing, Too, Jurist: Acad. Comment. (June 23, 2020), https://www.jurist.org/commentary/2020/06/sarah-lamdan-data-policing [https://perma.cc/8EV2-JCMG] (identifying Thomson Reuters and RELX as massive data analytics corporations engaging in the surveillance and sale of personal data). Further, law enforcement buys these data to build massive and discriminatory police surveillance networks. 11 Id. (“[T]oday’s policing infrastructure . . . spends millions of dollars on an invisible, sprawling data surveillance industry[,] . . . form[ing] oppressive systems that discriminate against communities of color, refugees, and migrants.”). All these personal datasets are summarized using a collection of methods identified by scholars as “Big Data analytics,” 12 Favaretto et al., supra note 9, at 2 (defining “Big Data analytics” as “the plethora of advanced digital techniques (e.g.[,] data mining, neural networks, deep learning, profiling, automatic decision making and scoring systems) designed to analyze large datasets with the aim of revealing patterns, trends and associations, related to human behavior”). and they can be used to inform companies and institutions on whether to approve a loan, grant parole, or deny a job application, among other things. 13 Id. With access to Big Data, machine learning helps uncover consumer trends and patterns through decisionmaking algorithms. 14 Chithrai Mani, How Is Big Data Analytics Using Machine Learning?, Forbes (Oct. 20, 2020), https://www.forbes.com/sites/forbestechcouncil/2020/10/20/how-is-big-data-analytics-using-machine-learning [https://perma.cc/9NFV-RXHQ]. Businesses rely on these algorithms to categorize the data and recognize patterns they can use. 15 Id. Although information and patterns on their own give an impression of impartiality, bias and racism thrive when Big Data analysts share and sell data. 16 See infra section I.A.

What, then, is being done about this? Only recently has the United States embarked on the journey of building privacy regulations and data-protection laws to protect the people who use varied technologies, social media, and websites. 17 See infra sections I.B–.C. The United States has seen enormous transformation in privacy regulation and in the steps taken to combat the potential dangers inherent in a society interwoven with the online world. 18 See infra section I.D. States have been stitching together the first wave of defenses against privacy infringements, one statute at a time. 19 See id.

Five states have taken necessary steps to strengthen their data privacy laws, creating seemingly more robust and comprehensive legislation that establishes a standard for consumer privacy. This legislation is already being enforced in California, Colorado, Connecticut, Utah, and Virginia. 20 Andrew Folks, US State Privacy Legislation Tracker, Int’l Ass’n of Priv. Pros., https://iapp.org/resources/article/us-state-privacy-legislation-tracker [https://perma.cc/G7WL-8ADZ] (last updated Feb. 23, 2024) (presenting images and descriptions of state privacy legislation as it exists today, as well as when enacted laws will become effective and actually enforced). California led the way on these data privacy and consumer laws with the California Consumer Privacy Act (CCPA), and the other four followed suit, even adopting much of the CCPA’s language. 21 California Consumer Privacy Laws, Bloomberg L., https://pro.bloomberglaw.com/brief/the-far-reaching-implications-of-the-california-consumer-privacy-act-ccpa/ [https://perma.cc/XEK8-86FH] (last visited Jan. 12, 2023) (stating that the CCPA is the “first comprehensive consumer privacy legislation in the U.S.” and may serve as a model for other states). Much of this language addresses companies that collect data, mandating full disclosure of what data is being taken and whether it is being sold. 22 Thorin Klosowski, The State of Consumer Data Privacy Laws in the US (and Why It Matters), N.Y. Times: Wirecutter (Sept. 6, 2021), https://www.nytimes.com/wirecutter/blog/state-of-privacy-laws-in-us (on file with the Columbia Law Review) (explaining that “a company operating under these regulations must tell you if it’s selling your data”). Additionally, these laws mandate opt-out provisions in an effort to allow people to take further individual control over whether their data can be sold or accessed. 23 Id. (describing the “global opt out” requirement). Data and consumer privacy concerns are rapidly growing: At least thirty-five states and the District of Columbia introduced or considered almost two hundred consumer privacy bills in 2022. 24 Pam Greenberg, 2022 Consumer Privacy Legislation, Nat’l Conf. of State Legislatures, https://www.ncsl.org/research/telecommunications-and-information-technology/2022-consumer-privacy-legislation.aspx [https://perma.cc/4W9J-3CTY] (last updated June 10, 2022).

As data privacy concerns and protections rapidly evolve, companies and organizations are keeping a close eye on shifting state regulations to anticipate how they will need to respond. Law enforcement has already begun using data-driven predictive models to zero in on areas and communities likely to be involved in criminal activities. 25 See Johana Bhuiyan, LAPD Ended Predictive Policing Programs Amid Public Outcry. A New Effort Shares Many of Their Flaws, The Guardian (Nov. 8, 2021), https://www.theguardian.com/us-news/2021/nov/07/lapd-predictive-policing-surveillance-reform [https://perma.cc/H8A4-6N3R] (discussing how new Los Angeles Police Department predictive policing programs “bear[] a striking resemblance” to past data-driven programs that came under immense scrutiny for disproportionately leading to overpolicing in Black and brown communities). Many of these data, which reflect preexisting biased arrest patterns, perpetuate the problem. Some departments, such as the Los Angeles Police Department (LAPD), have moved to “Big Data Policing,” also called “data-informed community-focused policing” (DICFP). 26 Id.; see also Sarah Brayne, Dye in the Cracks: The Limits of Legal Frameworks Governing Police Use of Big Data, 65 St. Louis U. L.J. 823, 826–28 (2021) (describing how the LAPD uses big data to surveil both the general population and suspects). Under this policy, law enforcement even coordinates directly with tech firms to surveil a person’s presence online (social media postings), to investigate crimes, and to monitor what it would deem potential threats. 27 Sam Levin, Revealed: LAPD Officers Told to Collect Social Media Data on Every Civilian They Stop, The Guardian (Sept. 8, 2021), https://www.theguardian.com/us-news/2021/sep/08/revealed-los-angeles-police-officers-gathering-social-media [https://perma.cc/4M5B-NL5D] (explaining how LAPD officers were told it was critical to collect civilian social media data for use in “investigations, arrests, and prosecutions” (quoting Memorandum from Michel R. Moore, Chief of Police, L.A. Police Dep’t, to All Department Personnel, L.A. Police Dep’t 1 (July 22, 2020), https://www.brennancenter.org/sites/default/files/2021-09/I.%20Beck%20FI%20Memo.pdf [https://perma.cc/LWZ9-PSW2])). Before many of the consumer privacy regulations, tech companies were under little to no obligation to inform their users about how they were sharing their users’ data or the ways their users’ actions were being monitored. 28 See Alison Divis, How the CCPA Benefits Consumers and Business Owners, Pac. Data Integrators, https://www.pacificdataintegrators.com/insights/ccpa-benefits [https://perma.cc/MYF3-WA7Q] (last visited Jan. 15, 2023) (explaining how the CCPA was set to change the privacy landscape).

The emergence of the opt-out provision, specifically, returned some degree of agency to consumers over their privacy permissions and whether they allow a company to share or sell their data. In 2012, a story emerged detailing how Target was able to predict people’s pregnancies before they had so much as told their families. 29 Charles Duhigg, How Companies Learn Your Secrets, N.Y. Times (Feb. 16, 2012), https://www.nytimes.com/2012/02/19/magazine/shopping-habits.html (on file with the Columbia Law Review). These predictions were possible thanks to a statistician, “predictive analytics,” and unfettered access to consumer data. 30 See id. (identifying these three factors as central reasons for Target’s successful predictions). Consumer data privacy and the right to opt out emerged after such events, and consumers who seek out that right may exercise some agency to avoid similar situations. At least, that is the perception. Virtually all the enhanced-privacy and consumer protection regulations relevant here were passed within the past three years, and countless more are inevitably on the horizon. 31 See Greenberg, supra note 24 (providing an overview of the progression of consumer privacy legislation). Amid the flurry of new laws both passed and upcoming, no one has thoroughly evaluated how effective these regulations are at avoiding these potential avenues of racism and bias. The opt-out provisions found within each of the new regulations contain “antidiscrimination” sections, but the language therein is thin and leaves many questions unanswered. 32 See infra section I.D.3. Is allowing for opt-out provisions and providing antidiscrimination language really benefitting diverse and marginalized communities? Or is it merely cementing the position of surveillance and tracking in our society while making it transparently known that this is the status quo?

These new privacy laws’ variety and novelty raise questions about their effectiveness and the impact they actually have. Because machine learning on people’s behaviors and preferences can lead to wide-scale algorithmic bias, the pattern of which consumers opt out also shapes what the algorithms learn. That is to say, algorithmic bias arises based on who opts out and who does not. Access to consumer data can create discriminatory and unequal treatment, which may be exacerbated by disparities in participation in opt-out provisions, increasing the vulnerability of populations less aware of, or less educated about, the potential dangers. It is crucial that the United States implement a more robust regulatory system regarding its opt-out provisions to protect those who are most vulnerable in the digital world.

This Note starts in Part I with a discussion of the history of discrimination enabled by a lack of data privacy. Part I then turns to state-specific privacy regulations, providing a general overview of the key rights found in these regulations and discussing the regulations’ strengths. Part II looks at the laws as they are applied and breaks down the ways that the regulations may generate discrimination based on who decides to opt out. Part III addresses potential remedies in the form of a national privacy framework; mandated opt-in provisions in place of opt-out provisions; and altered presentation of the existing opt-out website pop-ups to make them both easy to understand and unavoidable by consumers.