In an era where technology evolves at lightning speed, so do the methods of those seeking to exploit it for nefarious purposes. In a recent York County, Pennsylvania case, Luke A. Teipel, 22, of Dallastown, is charged with 33 felony counts of possession of child sexual abuse material, including artificially generated images, and one count of criminal use of a communication facility, bringing the fight against child exploitation to Pennsylvania’s doorstep.
This is the first charge since Pennsylvania enacted a law in 2024 specifically criminalizing the creation and possession of AI-generated CSAM. Act 125 updated prior laws to prohibit the use of artificial intelligence technology to create materials that appear to “authentically depict a child under 18” engaging in sexually abusive acts that did not occur in reality.
The Alarming Rise of AI-Generated Child Exploitation
Artificial Intelligence, while offering numerous benefits, has been weaponized by predators to create hyper-realistic, synthetic images of child abuse. According to Thorn, 1 in 6 minors who are involved in a potentially harmful online sexual interaction never disclose it to anyone.
According to the Internet Watch Foundation (IWF), there was a 380% increase in reports of AI-generated illegal imagery in 2024 compared to the previous year, totaling over 7,600 images and videos. Disturbingly, nearly 40% of this content fell into “category A” — the most extreme classification, including penetration and sadism.
These images are not just digital fabrications; they represent a new frontier of victimization, where the lines between real and synthetic abuse blur, causing real psychological harm to victims and their families.
Legal Systems Struggling to Keep Pace
The legal landscape is racing to catch up with the rapid advancement of AI technologies. While Pennsylvania has taken a significant step by enacting laws against AI-generated CSAM, many jurisdictions still lack comparable statutes. This legal ambiguity hampers the ability of law enforcement and prosecutors to hold perpetrators accountable.
Moreover, the internet’s global nature means that this content can spread worldwide within moments, necessitating international cooperation. Currently, there are no standardized legal frameworks to combat this issue effectively.
Pennsylvania, however, is emerging as a leader in this fight: in addition to the new legislation, a related lawsuit was recently filed in Lancaster, PA.
The Role of Advocacy and Legal Support
Advocacy and legal support are the tools we use to create systemic change. Andreozzi + Foote is dedicated to representing survivors of sexual abuse and exploitation, ensuring they have a voice and access to justice. By providing trauma-informed legal counsel, pursuing civil litigation against perpetrators, and advocating for stronger protective laws, we actively combat the proliferation of AI-generated CSAM.
Survivors and their families need to know they are not alone. Legal avenues exist to seek redress, hold offenders accountable, and push for systemic changes that prioritize the safety and dignity of all individuals, especially children.
Taking a Stand: How You Can Help
Addressing the crisis of AI-generated child exploitation requires a collective effort:
- Educate Yourself and Others: Understanding the nature of AI-generated CSAM is the first step in combating it.
- Advocate for Stronger Laws: Support legislation that criminalizes the creation and distribution of AI-generated CSAM and provides clear guidelines for prosecution.
- Support Victim Advocacy Groups: Organizations working on the front lines need resources and public backing to continue their vital work.
- Report Suspicious Content: If you encounter suspected CSAM, immediately report it to the appropriate authorities.
- Help Victims: The National Center for Missing & Exploited Children has created a tool, Take It Down, to help remove these images.
Seeking Justice and Providing Support
The families impacted must find support and understanding in the legal system. It’s crucial to hold accountable those who misuse technology to harm others. Andreozzi + Foote is holding institutions accountable for failing to act in cases like these. The impacts of AI-generated CSAM are vast, leading to:
- depression
- substance use
- shame
- fear
- reputational harm, and more
Together, we can build a framework of awareness, support, and education by holding entities accountable for AI-generated CSAM. Contact Andreozzi + Foote at 1-866-753-5458.