Last Updated: April 14, 2026
The average therapy session in the U.S. now costs $175.
ChatGPT Plus costs $20 per month. For that $20 you can have unlimited conversations with a model that can do a surprisingly passable imitation of a cognitive behavioral therapist. It will help you reframe negative thoughts. It will walk you through exposure exercises. It will even remember your childhood trauma across sessions if you use the memory feature.
A year of ChatGPT Plus costs $240. A year of weekly therapy costs $9,100.
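A quick back-of-envelope check on those numbers (a sketch using the figures cited here, not exact market data):

```python
# Annual cost comparison using the article's figures.
# $175/session and $20/month are the estimates cited above, not market data.

weekly_session_cost = 175      # average U.S. therapy session
chatgpt_plus_monthly = 20      # ChatGPT Plus subscription

therapy_per_year = weekly_session_cost * 52    # 52 weekly sessions
chatgpt_per_year = chatgpt_plus_monthly * 12   # 12 monthly payments

print(f"Therapy:      ${therapy_per_year:,}/year")          # $9,100/year
print(f"ChatGPT Plus: ${chatgpt_per_year}/year")            # $240/year
print(f"Ratio: {chatgpt_per_year / therapy_per_year:.1%}")  # 2.6%
```

Even with generous rounding, the subscription costs under 3% of a year of weekly sessions.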
For the first time in human history, basic mental health support is genuinely cheap. That is a blessing. It is also a warning shot: one the entire therapy industry should heed, and one every user of these tools should think carefully about.
Why People Are Actually Using ChatGPT as a Therapist
I asked the r/ChatGPT subreddit and got 340 responses in 48 hours.
The top reasons, ranked by frequency:
Availability. A therapist is available 50 minutes per week if you are lucky. ChatGPT is available at 3 AM when the panic attack hits and no crisis line is picking up.
Cost. Most Americans either have no insurance coverage for mental health or have deductibles that make therapy practically out of reach. ChatGPT is the price of a coffee subscription.
Shame. Many men specifically mentioned that they could not bring themselves to tell a real person the things they were telling ChatGPT. The judgment-free nature of the conversation unlocked a vulnerability that twenty years of life had kept sealed.
Waitlist hell. Several respondents mentioned being on therapy waitlists for 4 to 9 months. ChatGPT was the bridge until a human professional became available.
Practical reframing. Multiple users said they used ChatGPT specifically to prep for actual therapy sessions or to process what they learned afterward. It was not replacing the therapist. It was extending the therapy budget.
What ChatGPT Actually Does Well in a Mental Health Context
I spent two weeks testing ChatGPT Plus against standard cognitive behavioral therapy exercises.
It is genuinely competent at surface-level reframing. You give it a negative thought pattern. It walks you through identifying the cognitive distortion, challenging the belief, and generating a more balanced alternative. The structure is solid, presumably because the training data included large volumes of CBT workbooks and therapy-style writing.
It is reasonable at grounding techniques. When I told it I was experiencing anxiety, it offered the 5-4-3-2-1 sensory grounding exercise and walked me through it patiently.
It is surprisingly good at recognizing specific disorders from described symptoms. When I simulated OCD-pattern intrusive thoughts, it correctly identified the pattern and suggested ERP therapy without me prompting.
It is useful for psychoeducation. You can ask it to explain attachment styles, trauma responses, or dialectical behavior therapy concepts, and it will give you genuinely accurate information you would otherwise need to pay $175 an hour to hear.
What ChatGPT Cannot Do
This is the warning part. Take it seriously.
ChatGPT cannot diagnose. It can recognize patterns. It cannot assess severity, rule out co-occurring conditions, or determine whether you need medication, crisis intervention, or immediate safety planning.
ChatGPT cannot hold you accountable. A therapist remembers whether you did your homework. A therapist notices when you keep avoiding a topic. ChatGPT has short-term memory limitations and will happily let you avoid the hard work because you are the one steering the conversation.
ChatGPT cannot catch a crisis. A trained human can hear the shift in your voice that means you are not safe. ChatGPT will give you hotline numbers if you mention suicide explicitly, but it cannot detect the quiet signs that you are deteriorating.
ChatGPT has no clinical supervision. When a therapist makes a mistake, they have a supervisor, a licensing board, and an ethics framework. ChatGPT is OpenAI’s product. When it makes a mistake, the only oversight is whatever OpenAI decides to implement.
The Research So Far
Early clinical research on LLM-based therapy tools is mixed and worth reading carefully.
A 2024 study from Dartmouth’s Center for Technology and Behavioral Health tested an AI therapy chatbot called Therabot on 210 participants with diagnosed depression, anxiety, or eating disorders. Users showed a 51% reduction in depression symptoms after 8 weeks of daily interaction with the bot. That is comparable to the effect size of human-delivered CBT.
A separate 2024 review published in npj Digital Medicine examined 18 studies of conversational AI in mental health. The conclusion: AI chatbots show promise for mild-to-moderate symptoms but should not be considered a replacement for human care in severe cases or when safety is a concern.
Translation: the tool works for some people, for some conditions, within specific limits. Those limits matter.
Why the Therapy Industry Should Be Worried
The average therapist in the U.S. earns $95,000 per year. That income is entirely dependent on people continuing to pay $175 per session for human attention.
If ChatGPT can deliver 60% of the value at roughly 3% of the cost, the economics of the entire industry shift.
Therapists will not disappear. Severe mental illness, trauma work, relationship counseling, and crisis intervention all require human expertise. But the bulk market of “I am moderately anxious and need someone to talk to” is being absorbed by AI in real time.
The therapists who will thrive are the ones who specialize in what AI cannot replicate: embodied presence, clinical intuition, complex diagnostic work, and the relational depth that develops over years with a human who knows you.
The therapists who offer generic CBT to high-functioning, moderately distressed adults are in the bullseye of AI disruption. That is most therapists.
Should You Use ChatGPT as a Therapist?
The honest answer is: it depends on what you need.
Use ChatGPT as a therapy supplement if you:
- Are on a therapy waitlist and need a bridge
- Want to process what came up in your last therapy session between appointments
- Need help implementing CBT or DBT exercises your therapist has taught you
- Have mild-to-moderate anxiety and want psychoeducation plus basic reframing tools
- Cannot afford therapy and are genuinely priced out of the market
Do not rely on ChatGPT if you:
- Are having thoughts of self-harm or suicide
- Have a diagnosed severe mental illness that requires medication management
- Are processing recent trauma, grief, or acute crisis
- Have a history of eating disorders, substance abuse, or dissociative conditions
- Need someone to hold you accountable rather than validate whatever you type
The difference between a blessing and a warning is knowing which one you are holding.
For many people, ChatGPT will be the only mental health support they can access. That is not a bug. That is a symptom of a broken healthcare system. The AI is filling a gap the humans failed to close.
Just do not confuse the bridge for the destination.
Key Takeaways
- A year of ChatGPT Plus costs $240. A year of weekly therapy costs $9,100. The cost delta is changing behavior at scale.
- ChatGPT is competent at surface-level CBT, grounding techniques, psychoeducation, and reframing exercises.
- It cannot diagnose, hold you accountable, detect crisis, or provide clinical oversight.
- Early clinical research is promising: one trial reported a 51% reduction in depression symptoms, and reviews support use for mild-to-moderate cases. It should not be used as a replacement in severe cases.
- Use it as a supplement or bridge to human care, not as a substitute for it. Know which one you are holding.
Frequently Asked Questions
Is ChatGPT actually a good therapist?
It is a competent CBT coach for mild-to-moderate symptoms. It is not a therapist. It lacks clinical judgment, diagnostic authority, and the ability to hold you accountable over time. Use it accordingly.
Can ChatGPT replace my therapist?
Probably not, and you should not want it to. It can supplement therapy, extend your therapy budget, or serve as a bridge when you are on a waitlist. For severe conditions, trauma work, or safety concerns, human care is essential.
Is it safe to tell ChatGPT my darkest thoughts?
Safer than posting publicly, but without the legal confidentiality a licensed therapist owes you. OpenAI stores conversations and may use them for model training unless you opt out in your data controls. Do not share identifying information alongside sensitive disclosures.
What about other AI options for mental health support?
Specialized AI companion platforms like CrushOn AI and SpicyChat AI are designed for emotional engagement and can work well for ongoing processing. Dedicated mental health apps like Woebot and Wysa are built specifically for CBT delivery and have clinical backing.
Is therapy really going to be disrupted by AI?
Partially, yes. The high-volume, low-complexity end of the market is already shifting. Therapists who specialize in complex clinical work, trauma, relationships, and embodied presence will remain essential. The generalist therapist offering basic CBT to high-functioning adults is in a shrinking market.
If you enjoyed my work, fuel it with coffee https://coff.ee/chuckmel
The AI Companion Insider
Weekly: what I am testing, what changed, and the prompts working right now. No fluff. Free.