When Your “AI Therapist” Goes Silent: What the ChatGPT-5 Shake-Up Teaches Us About Mental Health Care

Smartphone showing voice-dictation screen reading “Tap to interrupt” beside a coffee cup, symbolizing AI tools in mental health

The Great Personality Vanish

If you opened ChatGPT this week and thought, “Who are you, and what did you do with my chatbot?” you’re not alone. OpenAI’s rollout of ChatGPT-5 has the internet rumbling. Many users are grieving the loss of the quirky personalities they’d grown attached to in earlier versions. For some, that disappointment is a passing annoyance, but for those who had started to lean on AI as a kind of always-on therapist, the sudden shift can feel like losing a trusted confidant overnight.

A recent Stanford HAI article, “Exploring the Dangers of AI in Mental Health Care,” notes that AI tools can disappear, change, or hallucinate at any moment, making them an unstable stand-in for professional therapy.

Why Losing an AI Personality Hurts

  1. We are wired for attachment
    Our brains evolved to bond with any consistent, empathetic responder. We have never needed to distinguish between human and bot, so our brains are unprepared, and an abrupt loss can spark genuine grief or anxiety.

  2. AI has no obligation to you
    Human therapists adhere to ethics codes (APA, ACA, AAMFT, etc.) that prohibit abandoning clients without notice. By contrast, AI disclaimers explicitly state “no professional relationship.”

  3. Context resets
    Each model upgrade can wipe the nuanced history you’ve shared. Imagine retelling your life story from scratch because your therapist “forgot” everything you had ever shared and worked on.

Where AI Can Shine: As an Adjunct

Used intentionally alongside real therapy, AI can be a valuable tool.

Here are some use cases I have seen in the therapy room:

  • Habit tracking & reminders

    👍🏽 Conveniently log sleep, exercise, or mood and organize results.

    👎🏽 Don’t rely on AI for nuanced interpretation; instead, bring the data to your therapist.

  • Executive-function boosters

    👍🏽 Break big goals into micro-tasks, identify the next step, and set calendar prompts.

    👎🏽 Useful for structure, but review its suggestions to make sure they are accurate.

  • Journaling & emotion summaries

    👍🏽 Ask AI to turn a rambling vent into bullet-point insights you can share in session or with others to improve communication.

    👎🏽 The “summary” is only as accurate as the prompt; verify it against your own feelings.

  • Psycho-education on demand

    👍🏽 Quick explanation of psychological terms, grounding exercises, or mindfulness scripts.

    👎🏽 Quality varies and nuance is often missing; check sources or discuss new techniques with your therapist before trying them.

Human & AI: A Hybrid Model of Helping

When I reflect on my career as a therapist, one of the biggest surprises has been the need to navigate societal disruptions (political polarization, a global pandemic, and the emergence of AI) and to develop strategies in real time to effectively support my clients.

I took my first continuing education course on the legal and ethical considerations of AI in clinical use over a year ago. AI isn’t emerging; it’s here. I invite my clients to use therapy as a space not only to process their thoughts and feelings about AI, but also to find ways to integrate it as a tool and adjunct to their personal growth.

Ways I have offered support lately include:

  • Validating accuracy - helping clients separate evidence-based advice from algorithmic guesswork.

  • Tailoring the tool - adapting prompts so AI can be supportive of client goals.

  • Planning for glitches - building coping strategies for the inevitable “Sorry, something went wrong” moments.

Quick Safety Checklist if You Use AI for Mental Health

  1. Know its limits - The tool offers information, not therapy.

  2. Keep private data private - No diagnoses, addresses, or deep secrets.

  3. Have a backup - A real therapist, crisis line, or trusted friend.

  4. Check the source - Verify any mental health “facts” with reputable organizations (APA, NIMH).

The Bottom Line

AI can be a remarkable supplemental tool, like a digital whiteboard or a pocket coach, but it can’t replace the ethical duty, human care, and stable presence of a licensed therapist. If the recent ChatGPT switch-up left you feeling unsettled, consider it a gentle nudge to anchor your growth in a relationship that won’t disappear with the next software update.

Ready to try therapy with a human who won’t ghost you? Learn more about my practice or book a free 15-minute consult to see how therapy (with optional AI assists) can help you optimize your mental health.

Now accepting new clients across Washington State. I provide virtual therapy for individuals throughout WA, including Seattle, Spokane, Tacoma, Vancouver, Bellevue, Olympia, Bellingham, Everett, and beyond. Whether you're in a big city or a quiet corner of the state, support is just a click away.

This article shares general information and ideas about using AI alongside therapy. It is not medical or mental health advice and does not create a therapist-client relationship. The AI landscape changes quickly, so details may change after publication. Please consult a licensed clinician before making care decisions.

Last updated: August 2025
