Finally Holding Online Platforms Responsible for Child Exploitation
For years, survivors of child sexual abuse material (CSAM) and non-consensual intimate images have fought an uphill battle to get their images taken down and to hold platforms accountable. This month, the Federal Trade Commission (FTC), along with the state of Utah, filed a major action against Pornhub and other adult websites, accusing them of misleading users about their efforts to remove illegal and non-consensual content.
This is an important moment in the fight for digital safety and survivors’ rights. Here’s why.
The FTC’s Allegations
According to the FTC’s complaint, the platforms promised quick removal of CSAM and non-consensual videos but often failed to act, profiting from illegal content by monetizing ad views. The complaint also alleges that the platforms failed to provide users with effective reporting tools or appeals processes.
If proven, these violations could result in significant penalties, as well as mandatory reforms to how the platforms monitor, remove, and report illegal material.
Why This Matters for Survivors
Survivors of CSAM face the ongoing, devastating harm of knowing their abuse is being viewed online, sometimes millions of times. Even when images are removed, they frequently reappear on other platforms; nothing is ever truly wiped from the internet. The FTC’s action signals a new era of enforcement: companies cannot hide behind empty promises while profiting from exploitation.
Civil Justice and Platform Liability
In 2018, Congress passed the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) and the Stop Enabling Sex Traffickers Act (SESTA), which amended the Communications Decency Act to hold online platforms legally accountable when they knowingly assist, facilitate, or support sex trafficking. Lawmakers designed FOSTA-SESTA to curb sex trafficking, but critics argue the laws harm sex workers by forcing platforms to censor speech and to shut down safety tools, such as client verification systems and “bad date lists,” that sex workers use to screen clients and protect themselves.
While Section 230 of the Communications Decency Act has historically shielded platforms from liability, FOSTA-SESTA created an exception for sites that knowingly facilitate sex trafficking. Survivors may now have stronger claims when platforms:
- Fail to respond to takedown requests.
- Ignore evidence of illegal content.
- Monetize known CSAM.
Andreozzi + Foote is closely watching these developments and exploring avenues for civil litigation that could help survivors recover damages.
Resources for Survivors
- NCMEC CyberTipline: Report CSAM at cybertipline.org
- Take It Down: A free tool to help minors remove explicit images online (takeitdown.org)
- RAINN National Sexual Assault Hotline: 1-800-656-4673
Our firm represents survivors of online exploitation and holds companies accountable for their role in perpetuating harm. If your images were shared without consent or you are a victim of CSAM, contact us today to discuss your options.