Digital Battlefield: US Survivors of Online Abuse Failed by Law & Tech
- $7.5 trillion: Combined market value of the 'Big Five' tech companies (Alphabet, Amazon, Apple, Meta, Microsoft) in 2020
- 45: Number of states with updated laws addressing AI-generated child sexual abuse material (CSAM), but without equivalent protections for adults against deepfakes and other nonconsensual computer-edited material
- 100%: Share of survivors in the study who were unable to have abusive material fully removed from platforms
Experts conclude that the U.S. legal system and major tech companies are failing survivors of online sexual exploitation and abuse (OSEA), leaving them without adequate protections or justice due to outdated laws, inconsistent corporate policies, and systemic gaps in accountability.
NEW YORK, NY – January 28, 2026 – Survivors of online sexual exploitation and abuse (OSEA) in the United States are being systematically failed by a broken legal system and the powerful technology companies whose platforms facilitate the abuse, a groundbreaking new report has found. The analysis, released by Equality Now and the Sexual Violence Prevention Association (SVPA), paints a grim picture of a digital landscape where victims are left to navigate a labyrinth of outdated laws and indifferent corporate policies, often alone and at great personal cost.
Drawing on the harrowing lived experiences of survivors and expert legal analysis, the report, Online sexual exploitation and abuse in the United States, exposes critical gaps in legal protections and widespread failures by the criminal justice system and major tech corporations. The findings reveal that for those whose most intimate moments are weaponized against them online, the path to justice is not just difficult—it is often nonexistent.
A Broken Legal Shield
The core of the problem lies in a fractured and inconsistent legal framework. Protections against OSEA in the U.S. are split between federal and state systems, creating what the report describes as a “confusing patchwork of laws” that complicates cases and creates dangerous loopholes. Federal laws covering tech-facilitated abuse are not comprehensive, leaving states to fill the void with a jumble of statutes that vary wildly in scope and severity. A survivor’s access to justice can depend entirely on their zip code.
This legal disarray is particularly detrimental given the borderless nature of online abuse, which often involves multiple platforms and perpetrators in different jurisdictions. Survivors are left confused about which laws apply and which authorities have the power to act. The report highlights a concerning lack of laws addressing OSEA cases that involve adult victims and span international borders.
The legislative lag is even more pronounced with the rise of new technologies. While 45 states have updated laws to address AI-generated child sexual abuse material (CSAM), protections for adults against so-called “deepfakes” and other nonconsensual computer-edited materials are dangerously behind. Penalties for creating and distributing these materials are inconsistent, ranging from a misdemeanor to a felony, failing to reflect the profound harm they cause.
“US laws have failed to keep pace with the realities of tech-facilitated sexual abuse, and survivors are paying the price,” explains Anastasia Law of Equality Now. “With no US federal statute requiring tech companies to ensure user safety or transparent reporting systems, survivors must navigate outdated laws, inconsistent responses, and repeated obstacles when trying to take down abusive material or hold perpetrators accountable.”
The Unaccountable Titans of Tech
While the legal system flounders, the technology companies that dominate the digital world operate with minimal legal obligation to protect their users. The report points to the “Big Five”—Alphabet (Google), Amazon, Apple, Meta, and Microsoft—as corporations with a combined market value of $7.5 trillion in 2020, giving them unprecedented power to shape global safety standards. Yet, no federal statute expressly requires them to maintain user safety.
For survivors, this lack of accountability translates into a nightmare of ineffective reporting systems and endless re-traumatization. Every survivor interviewed for the study suffered from the repeated reposting of their abusive material across multiple platforms, and none succeeded in having it removed entirely. They described a grueling, self-driven process of monitoring digital spaces and filing repeated “takedown requests,” often daily, only to wait months or years for a response—if one came at all.
One survivor, Izzy, had her Snapchat account hacked, and intimate images were sold to pornography websites. Her family was blackmailed. When she turned to the platform for help, the response was crushing. “Within their community guidelines, they say you’re not supposed to take any sexually explicit pictures of yourself, so if anything does happen to you, that’s your fault,” Izzy recalled. “It genuinely made me sick to my stomach how dismissive they were!”
This experience is not an anomaly. Survivors reported that platforms were often slow to act, and in some cases, the act of content moderation itself erased critical evidence needed for legal investigations, further undermining their quest for justice. The burden of proof, evidence preservation, and constant vigilance falls squarely on the shoulders of the victim.
The Human Cost of Systemic Failure
Beyond the screen, the impact of OSEA is devastating and multifaceted. The report underscores that the harm is not merely digital; it manifests in severe emotional, financial, and even physical trauma. All survivors who formally reported their abuse found the experience with the criminal justice system to be overwhelmingly negative, encountering victim-blaming and a profound lack of knowledge from law enforcement, prosecutors, and even victim advocates.
Samantha, who had a video of her rape posted online, described the ongoing horror. “It’s one thing that the attack happened, but then, when it was shared to be rewatched over and over again, and I had no control over how far it was reaching… It emotionally was just horrifying.” This sense of perpetual violation and loss of control led four participants in the study to have suicidal thoughts.
The financial toll is another crushing burden. Survivors reported spending thousands of dollars on legal services and mental health counseling. Some, like Izzy, now pay $1,000 a month to a private company just to try to scrub the abusive content from the internet. The harm bleeds into their professional lives, with three interviewees losing their jobs and others facing harassment on professional networking sites like LinkedIn.
Survivors are often forced to become their own investigators and legal experts, educating officials on relevant statutes and coordinating between different agencies and platforms. This systemic failure to provide a trauma-informed response only deepens the wounds of the initial abuse, leaving victims feeling betrayed and abandoned by the very systems meant to protect them.
A Call for a Survivor-Centered Future
In the face of these systemic failures, the report issues an urgent call for reform, centered on the voices and needs of survivors. The recommendations aim to reimagine digital safety by demanding accountability from both government and corporate giants.
“Online sexual exploitation and abuse is a form of systemic sexual violence rooted in misogyny, racism, and other intersecting oppressions,” states Katie Knick from the Sexual Violence Prevention Association. “While technology shapes how the harm occurs, prevention depends on dismantling rape culture and reducing power imbalances through education, policy reform, and institutional accountability.”
The path forward, according to the report, requires creating survivor-centered systems. This includes ensuring free legal representation, access to trauma-informed mental health care, and specialized training for all criminal justice professionals. Critically, it demands the creation of clear, effective, and transparent pathways for reporting and removing abusive material from online platforms.
Advocates are pushing lawmakers to strengthen both state and federal laws with clear policies governing consent and the online distribution of sexual material. The ultimate goal is to shift the burden from the survivor to the systems of power, holding tech companies fully accountable for the nonconsensual publication and spread of sexually explicit content on their platforms. Sustainable prevention, the report concludes, is only possible through policies and accountability measures that are directly informed by the leadership of survivors with lived experience.
