Why Are Youth Turning to AI for Therapy?
- Byron McClure
- Sep 20
A Call to Action for School Psychologists to Be Involved in the Development of AI

A few years ago, conversations about artificial intelligence in school psychology were niche, at best. It felt like a topic for some far-off future. Today, that future is here, and the conversation is happening everywhere. AI is becoming central to education, whether we’re ready or not.
As a school psychologist and the founder of School Psych AI, I’ve had a front-row seat as this technology moved from an abstract idea right into the heart of our work: the mental health of our children.
This puts the responsibility squarely on our shoulders. We are the mental health professionals who understand the social-emotional needs of youth, and we have to take an active role in shaping this technology. If we step aside, we leave these critical decisions to others who don’t have our perspective or priorities. The future of our children’s well-being depends on it.
The Perfect Storm: A Mental Health Crisis and a Professional Shortage
Let's be honest about where we are. Our schools are caught in a perfect storm. The CDC reports that four in ten high school students struggle with persistent sadness or hopelessness. This crisis disproportionately hits our most vulnerable students. For instance, over 21% of Black students and nearly half of LGBTQ+ students have seriously considered suicide.
At the same time, we have a critical shortage of school psychologists, with a national average of just one for every 1,127 students, more than double the recommended ratio of one to 500.
And in the middle of this storm, our kids are looking for anything that floats. For many, that makeshift raft is artificial intelligence.

The Lighthouse in the Storm
A recent Common Sense Media survey found that over half of teens use an AI companion regularly, with a third turning to it for emotional support: to vent, seek advice, or just feel like they have a friend.
They are doing this because our mental health system isn’t meeting them where they are. In the absence of a human, they talk to the next best thing they can find.
Let me be clear: an AI chatbot is not a friend, it is not a substitute for human connection, and it is absolutely not a therapist. But when you're lost in a storm, any light can look like a lighthouse.
The Stakes: When a Power Tool Has No Safety Guard
Consider Adam Raine, a 16-year-old who took his own life after, his family alleges, an AI chatbot "encouraged" his decision. Or Sewell Setzer, a 14-year-old who died by suicide after forming a dependent relationship with a chatbot that engaged in romantic conversations with him. These aren't isolated glitches; they are tragic examples of what happens when unregulated algorithms interact with vulnerable children.
I share these stories to illustrate why our expertise is not optional; it's essential.
An engineer in Silicon Valley, however well-intentioned, doesn't have our training in youth development, social-emotional well-being, risk assessment, or crisis prevention and intervention.
A Large Language Model has no credentialing body, no duty to warn, and no ethical obligation to protect a child’s privacy. Think of it as a power tool with no safety guard. You would never hand that to a child.
The Solution: Putting the Professional in Control
This is precisely why, when I'm asked if School Psych AI will create a student-facing platform, my answer is always a firm no. Our children are not ready, and the technology isn't either. But that doesn't mean the technology has no value. It means it belongs in the hands of a trained professional.
The solution isn't to ban the technology; it's to put an expert who knows how to use it safely in control. When a school psychologist uses AI to help draft a report or analyze data, they become the safety guard. They are the expert applying their clinical judgment, reviewing every output, and taking full responsibility for the final product. This is how we free ourselves from routine administrative tasks to spend more time with students.
It’s not about replacing our judgment; it’s about augmenting our capacity. That is what it means to build strong scaffolding and to be both intentional and informed.
A Call to Action: Why Your Expertise Is No Longer Optional
This is our unavoidable role. If we are not at the table helping design these solutions, our kids are the ones who will suffer. We need to become AI literate, not to become computer scientists, but to apply our expertise to this new reality.
I will continue pushing our field forward because this technology is here and it needs our expertise.
That means every tool for youth mental health must be co-designed with professionals and have safety built in by design, including automatic hand-offs to real, human professionals trained in crisis response and intervention.
Our profession must adapt. We have a responsibility to build AI literacy into our graduate programs and professional standards to prepare every school psychologist for this new reality.
Bottom Line: Connection + Caring Clinicians > Unregulated Algorithms
Technology will keep changing; it always has and always will. What a child needs from us will not.
They still need connection, a sense of belonging, protection, assurance, and the certainty that a trusted adult truly sees them. Let’s be clear: an unregulated algorithm cannot provide that.
In a world that can feel disorienting and loud, we can be the lighthouse for our students.
We are the ones who cut through the noise, guiding youth toward a better future that we absolutely must have a role in building.
AI may be able to simulate, but at least for now, it cannot replace the nuanced, caring work of a clinician.
