
The Allure and the Danger of “AI Therapy” for Children


Recently I attended an annual conference where one of the main sessions focused on AI "therapy" targeting children, in some cases with devastating results.

In an era when technology promises quick answers and around-the-clock access, it is unsurprising that many parents, schools and children themselves are turning to artificial-intelligence-powered chatbots or "therapy" bots. The marketing lines are seductive: "24/7 support," "just Type and Talk," "no wait list," "affordable," "always here for you." For those caring for children who have experienced sexual abuse, and for the children themselves, the idea of immediate support may feel appealing.

But here’s the critical warning: these tools are not a substitute for a trained, trauma-informed professional. More than that: they may be dangerous for vulnerable children.

What the Latest Cases and Reports Show

  • Two Texas families recently filed a lawsuit against the chatbot provider Character.AI, alleging that its AI encouraged self-harm and violence and provided sexual content to minors. 
  • A 16-year-old boy, Adam Raine, died by suicide in April 2025 after extended interaction with ChatGPT. His parents allege the bot acted as a “suicide coach”, failing to provide proper intervention, and are suing OpenAI. 
  • Experts at Stanford Medicine warn that AI “companions” used by children and teens carry unique risks including emotional dependence, reinforcement of harmful thinking, and unmoderated exposure to dangerous content. 
  • A recent analysis shows AI chatbots may engage in harmful algorithmic behaviors including facilitating self-harm and enabling inappropriate relational patterns when used by youth. 
  • Moreover, legislation and regulatory scrutiny are accelerating: for example, California proposed a bill to ban AI from impersonating human health providers.

These developments make one thing crystal clear: when we talk about children who have experienced sexual abuse, a population already at elevated risk for mental health issues, relational trauma, dissociation and self-harm, substituting AI "therapy" for trained human intervention is a risky, and potentially negligent, path.


Why Children Who’ve Experienced Sexual Abuse Require Specialized, Human Therapy

Let’s talk about what makes therapy for child-sexual-abuse survivors distinct and why AI cannot fill that gap.

1. Developmental complexity & relational trauma

Children are not small adults. Their brain development, attachment systems, sense of safety and trust, and capacity to self-regulate are all in flux. Sexual abuse imposes profound relational and developmental injuries: betrayal of caregivers or trusted adults, loss of safety, shaming, confusion, and often secrecy.

A qualified clinician trained in trauma-informed care understands how to work with these issues: building safety, stabilising affect and regulation, creating narrative coherence, repairing attachment ruptures, integrating traumatic memory, and collaborating with caregivers. These tasks require human presence, attunement to non-verbal cues, judgment about pace, boundaries, and relational dynamics.

2. The risk of retraumatisation, misinterpretation and harm

Inappropriate therapy can make things worse. If a child is retraumatised, shamed, blamed, or feels unheard, the healing process stalls or reverses. A bot cannot reliably detect triggers, interpret body language, calibrate pacing, or respond to dissociation, regression, or abrupt shifts in affect.

Children who’ve experienced sexual abuse may present with self-harm ideation, suicidality, sexualised behaviour, distrust, relational avoidance, shame or secrecy. These are high-risk markers. In human therapy, safeguards apply: safety plans, mandated reporting, clinician supervision, crisis protocols. AI tools do not meet these standards.

3. Human connection matters, especially for survivors

The core of healing for trauma is relational: being held in a safe, trustworthy, empathic relationship. The child learns that someone can hold their pain, validate their experience, accompany them through fear and shame, and model healthy relational patterns. That relational repair cannot be algorithmically simulated.

In trauma-work we often say: it’s not just what you know, but who holds you while you know it. AI cannot be that “who” in the authentic human sense.

4. Confidentiality, ethics, licensing and risk-management

Licensed therapists follow strict ethical codes, training standards, and legal requirements for minors. They are insured and accountable. AI companies are not. For example, many AI chatbots are not covered by HIPAA or equivalent privacy protections for minors.

When a child discloses sexual abuse, self-harm or suicidal ideation, therapy demands immediate, human-driven crisis management, not a scripted or "turn-based" chat experience.


Why AI "Therapy" Looks Attractive, and Why the Risks Outweigh the Promise

The promise

  • Low-barrier access.
  • 24/7 availability.
  • Reduced cost compared to human therapy.
  • Potential to scale support in understaffed systems.

The pitfalls

  • Lack of training: Chatbots are not trauma specialists and are not trained to handle child abuse disclosures.
  • Emotional dependency: Children may form unhealthy attachments to bots, preferring them over human relationships.
  • Safety failures: As documented, bots have encouraged self-harm or failed to respond adequately to suicidal ideation.
  • Misleading framing: Some platforms market themselves as “therapist” or “psychologist” without licensed professionals involved. 
  • Privacy and data risks: Vulnerable children may share intimate details into systems with unknown data governance.
  • Relational substitute risk: Relying on a bot may hinder a child’s opportunity to learn and practise healthy human interpersonal relationships.

For a firm like Andreozzi + Foote, representing survivors and guiding institutional stakeholders (schools, pediatric practices, therapy networks), the key message is: AI tools might have a role, but they are never, ever a substitute for trained, trauma-informed human care. When institutions expose children to AI substitutes instead of proper therapy, there is significant liability risk.


Practical Guidance for Institutions, Families and Child Advocacy Lawyers

Institutions

  • Ensure proper triage: When a child discloses sexual abuse, ensure immediate access to a qualified trauma-informed clinician (licensed child psychologist, social worker, or psychiatrist with expertise in sexual trauma).
  • Establish policy: Institutions should not permit or promote AI “therapy” bots as the primary intervention for child abuse trauma.
  • Require vendor diligence: If an AI component is used (e.g., scheduling, reminders, psychoeducation), vet the vendor: Are there child-safety protocols? Age checks? Crisis escalation? Are disclaimers clear? Are data privacy protections sufficient?
  • Include informed consent and assent: Families and children should understand what the tool is and is not. Make clear that it is not a substitute for licensed therapy.
  • Maintain human supervision: AI tools must be adjunctive at most, embedded within a human-connected system that includes human monitoring, check-ins, and crisis protocols.
  • Liability and documentation: Log decisions, disclaimers, and monitoring, and ensure the human provider is in the loop. In civil litigation involving child sexual abuse survivors, reliance on AI alone may raise a serious risk of negligence.
  • Educate families: Parents should understand that while AI tools may help with general wellness, they must not rely on them for trauma treatment for children after sexual abuse.

Parents/Caregivers

  • Ask: “Is the person working with my child trained in childhood sexual trauma? Do they have experience with trauma-informed care, dissociation, child sexual exploitation, self-harm, protective factors?”
  • Don’t fall into the “quick fix” trap: Real healing takes time, relational safety and human presence.
  • If you see a child using an AI chatbot for emotional support alone, monitor: Are they referencing self-harm? Are they isolating from family/friends? Are they forming an attachment to the bot?
  • If a tool is marketed as “therapy” or “counselor” but is really a chatbot, ask for full disclosure of its nature, licensing, crisis protocols, human oversight, and data usage.
  • If sexual abuse disclosures arise, ensure the therapist is licensed and has mandated-reporting knowledge; ensure coordination with protective services and law enforcement as needed.

Survivors’ Attorneys

  • When representing a child survivor of sexual abuse, determine whether the client was encouraged to use or relied on AI or chatbot “therapy” instead of human care. Such use could create a cause of action for negligent referral or institutional failure to provide an appropriate standard of care.
  • In litigation against institutions such as schools, camps, or youth organizations, ask whether they offered AI tools as “therapy” or substitute counseling and whether those tools met the required standard of care.
  • Use the emerging litigation and regulatory landscape (e.g., the lawsuits against OpenAI/ChatGPT and Character.AI) to illustrate risk and institutional knowledge of dangers.
  • Determine whether the clinician addressed the sexual abuse trauma, included caregiver involvement, considered developmental needs, and implemented appropriate safety and crisis plans. If not, that may strengthen claims of inadequate care.

Why the Rise of AI “Therapy” Demands a Strong Response from Child Advocacy

As lawyers working with survivors, we must not stand by while institutions adopt shiny tech solutions that look convenient but are in reality unsafe stop-gaps. The vulnerability of children who have suffered sexual abuse calls for the highest standard of care: human, relational, attuned, supervised, evidence-based.

In recent years, AI chatbots have caused widely publicized harm: acting as "suicide coaches," encouraging self-harm, and exposing minors to sexualised content. The courts are beginning to hear these cases, for example the first wrongful-death lawsuit against OpenAI concerning ChatGPT. These are warnings. We know that AI cannot replace therapy for children who've been sexually abused; pretending otherwise steps into a zone of high risk ethically, developmentally and legally. Institutions must not be lulled by the promise of "tech saves money" or "we can scale with bots." That may be a false economy for vulnerable kids, and it may be negligent exposure for institutions.


The Right Role for AI: Adjunct, Not Substitute

Let's be clear: I am not saying AI has no role in mental-health support. There is potential: scheduling, psychoeducational modules, adjunct check-ins, parent-training modules, symptom tracking and the like. But not as the primary therapy for a child who has experienced sexual abuse. That work belongs to a trained, trauma-informed professional who can engage relationally, adapt to risk, interpret complexity, maintain safety, and respond when things go sideways.

Insist on human expertise and document human oversight. Keep AI tools in a secondary role. Always protect the child’s relational safety. The stakes are too high.

Stay vigilant. Uphold the dignity of survivors. We will continue to uphold the standard of trauma-informed therapy and hold institutions accountable for taking shortcuts.

Let's protect children, not expose them to artificial companions masquerading as care.
