Artificial intelligence is everything therapy isn’t: accessible, low- or no-cost, unintimidating, and, most importantly, instant. “ChatGPT keeps me motivated when I’m sad or discouraged,” says Melissa, who works remotely and uses it to feel less alone. A few times a month, she’ll type in prompts like, “Give me a pep talk.” The replies she gets—“You got this,” “You’re stronger than you think”—are just the encouragement she needs.
Melissa understands that she’s not talking to a real person who’s showing real empathy, but that’s part of the appeal. “I don’t need a pretend friend—that’d give me the creeps,” she says. “It’s convenient—I can use the platform anytime, anywhere—and unbiased because it doesn’t actually know me.”
In the U.S., nearly half of adults with a mental illness don’t receive treatment, according to the National Alliance on Mental Illness. “Most therapists don’t take insurance, and there aren’t enough therapists or hotlines to support the number of people who need care,” explains Katy Cook, PhD, a therapist and author who studies our relationship with technology. For those who’ve grown up with Google to answer all their questions, chatbots feel like an obvious solution. Fifty-five percent of people ages 18 to 29 say they’d be comfortable talking to AI instead of a human therapist, according to a 2024 YouGov survey. Nearly 1 in 3 admit they’ve already treated it like a therapist.
“AI likely feels like a lifeline when it seems there’s nowhere else to turn,” says Cook. But pairing a national mental health crisis with AI as the sounding board of a generation is a recipe for disaster.
There are obvious issues with having a nonhuman entity parse real humans’ daily struggles. You may have heard about how, in February 2025, 29-year-old Sophie Rottenberg died by suicide after months of confiding in a ChatGPT-based therapist named Harry. Her mother later found Sophie’s logs. The bot had been soothing and encouraging but had no obligation or ability to intervene the way an actual licensed mental health professional would.
The tragic story is a stark reminder that while AI provides a lot of what young people need right now, it can’t offer a diagnosis or deliver vetted, impactful, and sometimes lifesaving therapy. “The gift of a trained therapist is to help you safely traverse new and sometimes unknown emotional terrain,” says Jenna Bennett, PhD, a licensed clinical psychologist who specializes in psychotherapy for identity and trauma with experience in crisis management. (Bennett is the sister-in-law of Cosmopolitan editor-in-chief Willa Bennett.)
A chatbot typically responds to prompts sycophantically, meaning it echoes whatever you say or believe without pushback. “There’s a near-guaranteed absence of judgment,” says Cook, “which feels good.” Therapists, on the other hand, challenge you: They might stop you mid-sentence, highlight a destructive pattern, or gently question your perspective. This leads to the kind of vulnerable discomfort that ultimately helps you heal and grow—a process that AI cannot replicate, says Bennett.
“Without that friction, you don’t learn how to navigate dissent, have difficult conversations, or repair after mistakes,” adds Cook. “The smoothness of the human-AI relationship is likely to diminish our capacity for the messiness of human relationships.” In other words, AI can make you feel less alone in the moment, but over time, it will train you to avoid the real-life relationship dynamics—with friends, partners, or family members—that emotionally sustain us all.
Allie, who first started using ChatGPT as a space to vent without leaning too hard on her circle, admits she sometimes gravitates toward it for exactly this reason. For her, a chatbot means never having to worry about being “too much.” “It’s freeing that I don’t have to feel like a burden to friends,” she says. “I tend to need to hash things out multiple times, and that can be exhausting to humans.” Plus, ChatGPT automatically gives her the kind of critique-free reassurance she can’t get in person. “It’s also a freeing feeling knowing that there won’t be judgment in whatever I share,” she confesses.
For folks who have been jaded by therapy—whether due to a less-than-stellar provider, costs, or a lack of availability in times of stress—this type of “treatment” can feel like sheer relief. “Talking to another human being requires trust,” says Cook. “The stakes are much lower when confiding in an AI chatbot.”
The truth is, that relief can come with fallout of its own. Licensed mental health professionals undergo years of training and abide by strict codes of conduct, including, crucially, those around provider-patient confidentiality. With AI, it’s not clear whether chatbots’ parent companies truly protect your personal details. And even if they do, their policies can change at any time—at the end of the day, these are moneymaking products that own your information.
This reality can get lost on many consumers, especially as the bots (and the companies) continue to blur the lines between intimacy and corporate business practices. Allie frequently reminds herself that “ChatGPT is a conglomeration of code, not an actual person,” she says, but at the same time, divulging to AI instead of a human “feels more private” to her. “I have more control over the ‘conversation’ because I don’t have to consider its feelings—because it doesn’t have them,” she explains.
It’s clear that AI isn’t going anywhere and that people will continue to ask it for mental health help. If you’re one of those people, Bennett urges you to also “reach out to at least one real person in your life and tell them what you’ve been struggling with.” If that’s out of the question, your best bet will always be a human therapist. Psychology Today, TherapyDen, and Open Path Collective are good starting points, says Bennett. The latter is a nonprofit where licensed therapists and those in training agree to see clients for around $30 to $70 per session.
Community health centers and university training clinics are another back door into more accessible care. Federally funded clinics often scale their fees to your income, and graduate programs typically run low-cost clinics where sessions with supervised trainees can cost $10 to $25. “Therapy exists at many price points,” says Bennett. “And if you have a friend or partner who loves their provider, ask if they can recommend someone.”
Consider this your reminder to also double-check your health insurance benefits and workplace employee assistance programs. A quick call can surface perks you didn’t know you had—many plans quietly cover a set number of therapy sessions, and employee programs often include prepaid counseling. These aren’t perfect solutions and they won’t solve our country’s systemic mental health issues, but they are solid starting places for finding human help.
Ultimately, therapy isn’t supposed to be like emotional bubble wrap. It’s more like strength training. The point is to learn to sit with discomfort, build resilience, and sometimes get lovingly called out on your own BS—processes that can be slow, uncomfortable, and deeply human. AI’s comfort-on-command models are not currently capable of replicating any of this, leaving people undiagnosed and at risk of forgetting what real healing and connection feel like.
Meaghan Wray (she/her) is a Toronto-based freelance writer, editor, and author covering culture, sex, relationships, self-esteem, and body and fat liberation. She's written for FASHION Magazine, Refinery29, Yahoo! Style, Chatelaine, and more, and is working on her first book of essays, With the Lights On: Fatness, F*cking, and Forgetting What I Learned About Love. You can find her on IG and TikTok @mggghn.