Introduction
Fentanyl poisoning is now the leading cause of death among Americans ages eighteen to forty-five, surpassing traffic accidents, suicide, and COVID-19.
Electronic communications and social media have played an outsized role in the ongoing opioid epidemic, leading the Drug Enforcement Administration to take aim at technology companies in recent years.
In 2021, the DEA issued a public warning about the growing number of fentanyl-laced counterfeit pills being sold online and blamed social media companies for failing to protect their users.
Between May 2022 and May 2023, the DEA conducted more than 1,400 investigations resulting in 3,337 arrests and the seizure of nearly 193 million deadly doses of fentanyl.
Over seventy percent of those investigations involved social media sites and encrypted communications platforms such as Facebook, Instagram, Signal, Snapchat, Telegram, TikTok, WhatsApp, Wickr, and Wire.
But these efforts have been insufficient, according to a bipartisan group of congressmembers, and the fentanyl crisis has worsened as “federal agencies have not had access to the necessary data to intervene.”
To address the inaccessibility of data held by third parties, Senators Roger Marshall and Jeanne Shaheen introduced the Cooper Davis Act in March 2023. The bill would require tech companies to report evidence of illicit fentanyl, methamphetamine, and counterfeit drug crimes occurring on their platforms to the DEA.
In July 2024, Representatives Angie Craig and Mariannette Miller-Meeks introduced the Cooper Davis and Devin Norring Act, which mirrors the Senate bill, in the House.
The proposed legislation would, for the first time, require electronic communications service providers and remote computing services (“providers”) to report suspected criminal activity by their users directly to federal law enforcement.
The Senate Judiciary Committee approved the Cooper Davis Act in July 2023, but the bill died at the end of the 118th Congress in January 2025.
Providers use a variety of automated moderation tools to detect content that violates their terms of service, such as drug transactions, spam, hate speech, and child sexual abuse material (CSAM).
Federal law requires providers to report evidence of CSAM to the National Center for Missing & Exploited Children (NCMEC), but providers are not statutorily obligated to report any other kind of suspected illegal activity.
The Cooper Davis Act is modeled after the federal statute requiring providers to report CSAM: the PROTECT Our Children Act of 2008 (PROTECT Act). Both laws aim to make technology companies play a more proactive role in aiding law enforcement and public safety efforts.
While courts have upheld the constitutionality of providers’ detecting and reporting CSAM pursuant to the PROTECT Act,
the proposed bill targets a qualitatively different kind of crime—one highly dependent on context.
This Note argues that by requiring providers to report directly to the government and prohibiting deliberate blindness to violations, the Cooper Davis Act would incentivize providers to conduct large-scale automated searches for drug-related activity, raising novel questions about the Fourth Amendment’s applicability to mandatory reporting laws for non-CSAM crimes.
This Note examines the constitutional problems raised by the Cooper Davis Act and, more broadly, legislation requiring providers to report evidence of illegal activity based on automated computer searches of their users’ communications. Part I introduces the proposed bill, its model statute, and Fourth Amendment issues stemming from providers’ CSAM reporting requirement, including a circuit split over the private search doctrine’s application in online CSAM cases. Part II discusses the differences between automated searches for CSAM and drug-related activity and outlines the novel Fourth Amendment questions raised by the Cooper Davis Act. Part III then explores these issues, concluding that courts would likely treat providers as private parties under the bill. Accordingly, Part III argues that courts should adopt a narrow private search exception to the Fourth Amendment, which best balances users’ privacy rights against the government’s public safety interests. This approach would also resolve the circuit split in online CSAM cases and provide clear guidance to courts as they confront algorithmic search methods in the future.