Can Sonny, The AI Chatbot, Provide Essential Mental Health Support For Students?
A teenager sits alone in their bedroom, struggling with anxiety before an important exam. A student feels like they have no one to turn to after a fight with a friend. A high schooler carries the kind of emotional weight that's too complicated to bring up in class. What if, instead of bottling it up, they had someone to text—someone who would actually listen?
This is the promise behind Sonny, an AI-powered "wellbeing companion" designed to fill the gaps left by a severe shortage of school counselors in the U.S. Created by Sonar Mental Health, Sonny isn't a therapist, nor does it pretend to be. Instead, it's an always-available digital confidant, helping students navigate everything from test stress to social drama—with a safety net of human oversight.

But can an AI-powered chatbot really support students in crisis, or is this a sign that we're outsourcing the work of human connection to machines?
The U.S. Education Department estimates that 17% of high schools don't have a counselor at all. Even in schools that do, the ratio of students per counselor is often staggering—sometimes exceeding 800 to 1. When mental health issues are on the rise and schools are stretched too thin, students are often left with two bad options: waiting weeks for an appointment or keeping their struggles to themselves.
Enter Sonny.
Unlike traditional therapy models, Sonny offers instant access to support via text, no appointment necessary. It's not just about crisis intervention—it's about having someone to talk to before things spiral.
How Sonny Works
Unlike most mental health chatbots, Sonny doesn't operate in isolation. Instead, it follows a human-in-the-loop model, where trained "Wellbeing Companions" (real people with backgrounds in psychology, social work, and crisis intervention) review every AI-generated response before it reaches a student. This balance aims to combine the efficiency of AI with the empathy of human oversight—ensuring that students aren't left with robotic or inappropriate responses.
The process is simple:
- Students text Sonny about anything on their minds—stress, friendships, family issues, or just venting about their day.
- The AI drafts a response tailored to the student's history and context.
- A human moderator reviews and refines the message before it's sent.
- If a student is in crisis, Sonny can escalate the situation to school officials or emergency services following a clinically informed protocol.
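The four steps above can be sketched in code. This is a minimal illustration of a human-in-the-loop pipeline, not Sonar's actual implementation: the function names, the template reply, and the crisis-keyword check are all placeholders invented for this example. The key property it demonstrates is that no AI-drafted message reaches a student without human sign-off, and flagged conversations take a separate escalation path.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Draft:
    student_id: str
    text: str
    approved: bool = False
    escalate: bool = False

# Placeholder crisis phrases; a real system would use a clinically
# informed protocol, not a keyword list.
CRISIS_TERMS = {"hurt myself", "can't go on"}

def ai_draft(student_id: str, message: str) -> Draft:
    """Step 2: the model drafts a reply (stubbed here as a template)."""
    reply = "That sounds stressful. Want to tell me more about it?"
    flagged = any(term in message.lower() for term in CRISIS_TERMS)
    return Draft(student_id, reply, escalate=flagged)

def human_review(draft: Draft, edited_text: Optional[str] = None) -> Draft:
    """Step 3: a Wellbeing Companion approves or rewrites the draft."""
    if edited_text is not None:
        draft.text = edited_text
    draft.approved = True
    return draft

def dispatch(draft: Draft) -> str:
    """Step 4: send to the student, or escalate per protocol."""
    if not draft.approved:
        raise RuntimeError("No message reaches a student without human review")
    if draft.escalate:
        return f"ESCALATED: notify on-call staff for {draft.student_id}"
    return f"SENT to {draft.student_id}: {draft.text}"
```

A routine message flows straight through review to the student, while a flagged one is routed to staff even after a human has approved the wording—mirroring the escalation step described above.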
What Students Talk About With Sonny
Sonar Mental Health reports that students confide in Sonny about a wide range of topics, including:
- Academic stress: From overwhelming homework loads to college application anxiety.
- Friendships & relationships: Navigating breakups, social conflicts, and family issues.
- Loneliness & self-esteem: Feeling isolated, unheard, or struggling with self-worth.
- Everyday vents & celebrations: Not everything is a crisis—sometimes, students just need someone to share their small wins.
The chatbot even has proactive check-ins, meaning it doesn't just wait for students to reach out—it can detect signs of distress and offer support before things escalate.
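One way to picture proactive check-ins is as a simple trigger: reach out when a student has gone quiet for a while, or when their self-reported mood is trending down. The thresholds and signals below are illustrative assumptions for this sketch—Sonar has not published how its detection actually works.

```python
from datetime import datetime, timedelta

def due_for_checkin(last_contact: datetime,
                    recent_moods: list,
                    now: datetime,
                    silence: timedelta = timedelta(days=7)) -> bool:
    """Illustrative heuristic: trigger a check-in after a week of
    silence, or when recent self-reported mood scores (1-5 scale,
    oldest first) are steadily declining."""
    if now - last_contact >= silence:
        return True
    if (len(recent_moods) >= 3
            and recent_moods[-1] < recent_moods[0]
            and all(b <= a for a, b in zip(recent_moods, recent_moods[1:]))):
        return True
    return False
```

The point of the sketch is the shift in posture: instead of waiting passively for a message, the system watches for absence and decline as signals worth acting on.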
Sonny isn't just about making mental health support available—it's about making it affordable and scalable. Hiring more counselors isn't always an option for budget-strapped school districts. AI-powered tools like Sonny offer an alternative that costs a fraction of traditional therapy services, while still reaching thousands of students.
Some key benefits for schools include:
- 24/7 availability—students can get support anytime, even outside school hours.
- Data-driven insights—Sonar provides schools with anonymized data on student well-being trends.
- Crisis management support—Sonny can escalate serious situations, reducing the burden on school staff.
For all of its strengths, Sonny raises serious questions about the future of mental health care in schools.
- Is it a solution, or just a Band-Aid? Sonny might help students in the moment, but it doesn't replace the need for long-term, in-depth counseling that many students require.
- Can AI ever truly understand human emotions? While Sonny is programmed to be empathetic, it still relies on algorithmic decision-making—which may not always catch the full nuance of a student's distress.
- What about privacy? Even though Sonny operates with strict data protections, the idea of AI collecting and analyzing student emotions is bound to raise concerns.
Whether we like it or not, AI is becoming a key player in the mental health space. From crisis text lines to therapy chatbots, digital tools are increasingly filling the gaps in a struggling system.
Sonny isn't here to replace counselors—it's here to bridge the gap between students in need and the overburdened mental health resources available to them.
For now, thousands of students are texting Sonny, finding comfort in a space where they can share their thoughts without fear of judgment. Whether AI-driven mental health support will ever be enough is still up for debate—but in a world where too many students suffer in silence, having someone—anyone—to talk to might just be the difference between despair and hope.