Chatbots, Cheap Therapy, and the Cost of Connection: ChatGPT Therapy Risks

Talk to any therapist at the moment and they’ll tell you about ChatGPT and the therapy risks involved. In an age of digital immediacy, the idea of therapy on demand is absolutely appealing. It is 3am, your mind is spinning, your heart is heavy, and the professional help you might need is asleep, expensive, or booked out for the next six weeks. So you turn to ChatGPT – the always-on, never-judging, free companion who will meet your words with instant, articulate feedback. It is a modern miracle. Until it is not.

We need to talk about these ChatGPT therapy risks – not as an alarmist rebuke of AI’s potential, but as a necessary critique of its unintended consequences. As the lines blur between tool and therapist, it is worth asking: what are we really seeking when we speak to a machine about our suffering?

The Allure of the Always-On Therapist

There is a reason so many people, from tech-savvy teens to burned-out professionals, are turning to AI for emotional support. The traditional mental health care system is, in many places, in a state of quiet collapse. Long waitlists, prohibitive costs, and overburdened clinicians leave gaps that AI rushes in to fill.

In Australia, there is currently even a push by government for professionals with Diplomas in Counselling to stop seeing clients in private practice, leaving people with more expensive options.

Platforms like ChatGPT offer instant accessibility, zero scheduling drama, and the illusion of a neutral, tireless listener. It will not blink at your shame. It will not sigh at your panic. For some, particularly those who have experienced stigma or trauma in human interactions, that is enough to feel like a kind of salvation.

Between Sympathy and Simulation: The Illusion of Empathy

One of the most quietly troubling aspects of AI-based mental health support is its mastery of emotional mimicry. ChatGPT can produce responses that feel empathic. It can say lines like “That must be really difficult for you”, without actually experiencing emotion or holding space for your humanity. It is not empathy. It is code.

When someone is feeling raw, vulnerable, or disoriented, this distinction blurs dangerously. In a widely cited case, a Stanford researcher conducted an experiment: posing as a suicidal person, they asked ChatGPT where to find the tallest bridges in New York. The bot replied with sincere-sounding sympathy – followed by a helpful list. No risk assessment. No safety net. Just algorithmic compassion coupled with actionable advice on how to die.

When Advice Becomes Algorithmic: What AI Can Do Well

To be balanced, we must acknowledge what AI does do well. If you need a tool to organise your thoughts, reframe cognitive distortions, or explain psychological concepts in clear language, then ChatGPT is excellent. It can walk you through the basics of CBT. It can help you weigh pros and cons. It can suggest breathing exercises or mindfulness scripts. For people in the midst of emotional overwhelm who simply need structure or clarity, it is genuinely useful.

It is also judgement-free. There is no intake process. No waiting rooms. No worried glances or therapist notes. Just text, right now, whenever you need it. And that alone, in a world where mental health care is often inaccessible, makes it feel indispensable.

The Danger of the Digital Confidant

The real trouble begins when AI becomes not just a tool, but a surrogate for human connection. A story reported in The Independent captured this risk with devastating clarity. A man struggling with mental illness became emotionally entangled with an AI character he created using ChatGPT. His delusions spiralled, and he ultimately died during a violent encounter with police. His father used ChatGPT to write his eulogy.

This is not fiction. It is not science fiction. It is a consequence of what experts are calling “chatbot psychosis” – a phenomenon where vulnerable individuals begin to anthropomorphise AI, believing it to be sentient, trustworthy, even loving. And when the illusion breaks, the psychological fallout can be catastrophic.

Why Human Therapy Still Matters

There is a moment in therapy – often a quiet, unremarkable one – where something shifts. You say something, and the therapist does not rush to fix it. They just sit with you in it. That pause, that silence, that sense of being held – it is irreplaceable. It is co-regulation. It is relational safety. And no chatbot, no matter how advanced, can replicate that.

AI cannot notice what you avoid. It will not catch the way you fidget when you talk about your father. It will not gently challenge your story when you are stuck in self-blame. It will not attune to your nervous system or offer its calm presence when your body is flooded with shame. And it certainly will not grieve with you.

ChatGPT Therapy Risks and Hypnotherapy

Hypnotherapists are always looking for both conscious and unconscious communication. We’ll be watching for shifts in your state, movement of your eyes, and sometimes even the subtle movement of a finger or an eyelid. Many of the clients I have worked with over time have insisted on us being in the same room – reluctant even to do a session online. At this point, these are all things that ChatGPT is unable to do effectively in real time.

Hypnotherapy is not something that a hypnotherapist does TO you; it is something that we do WITH you. It is a dance of sorts, where the hypnotherapist leads and invites you to follow. Hypnotherapy isn’t a script or a magic series of words. While all hypnosis is self-hypnosis, it is the guide you’re with who can help take you there.

The Ethical Abyss: Privacy, Profit and Digital Desperation

Beyond the clinical concerns, there is a deeper ethical issue at play. ChatGPT therapy risks include not just poor advice, but compromised safety and privacy. AI chat tools, particularly when offered through mental health apps, can be vehicles for data extraction. Stories have already emerged of bots giving harmful advice, of user data being monetised, of consent being bypassed.

The people most drawn to AI therapy – marginalised, overworked, uninsured – are often those least protected by regulation. They turn to ChatGPT out of necessity, not preference. But when care is outsourced to algorithms, we risk turning healing into a customer service transaction.

Augment, Do Not Replace: A Future That Includes Both

So, where does that leave us? Is ChatGPT evil? No. Is it a therapist? Also no. But it can be a helpful companion for self-reflection, particularly when used responsibly and with clear boundaries. Think of it as an interactive journal, or a decision-support tool. A prompt generator. A sounding board. Not a surrogate relationship.

Used ethically, it could become a bridge to therapy – a triage point, or a scaffold for people in between sessions. But it should never be the only voice in the room. And it must be paired with robust warnings, privacy protections, and clear guidance on what it is – and what it is not.

As we move into a future where AI becomes increasingly enmeshed in our emotional lives, we must be vigilant. Just because it talks like a therapist does not mean it listens like one. And when it comes to healing, listening – real, human, attuned listening – still matters most.

Let us not forget that mental health is not a problem to be solved. It is a relationship to be held. And in that holding, we become more fully human – not more efficiently robotic.

Release Hypnosis Melbourne Hypnotherapy

Since 2015, Lawrence Akers has been working under the name Release Hypnosis, offering hypnotherapy and ACT-based work to the people of Melbourne, as well as online. Based on St Kilda Rd, Release Hypnosis is an easy and convenient location to get to, accessible via the ANZAC Station train and tram stop. Release Hypnosis can help with a wide range of presenting issues, and I offer a free 30-minute, no-obligation discovery call for those who are unsure whether hypnotherapy is the right way forward for them.

Book Your FREE 30 Minute Consultation With Release Hypnosis NOW!

You may also like to read:
Hypnotherapy: A Guide to Healing Through the Subconscious
The Neuroscience of Gratitude and Effects on the Brain: Unlocking Mental Resilience
What Is The Success Rate of Hypnosis?

Release Hypnosis Melbourne Hypnotherapy is accessible for people in: Abbotsford, Armadale, Albert Park, Balwyn, Bentleigh, Black Rock, Box Hill, Brighton, Brunswick, Bulleen, Bundoora, Camberwell, Canterbury, Carnegie, Caulfield, Chadstone, Cheltenham, Clayton, Coburg, Collingwood, Deer Park, Doncaster, Elsternwick, Eltham, Elwood, Epping, Essendon, Fairfield, Fitzroy, Footscray, Glen Iris, Glen Waverley, Glenhuntly, Greensborough, Hampton, Hawthorn, Heidelberg, Highett, Ivanhoe, Kew, Kooyong, Lalor, Laverton, Lower Plenty, Macleod, Malvern, Middle Park, Moonee Ponds, Moorabbin, Mount Waverley, Murrumbeena, Northcote, Oakleigh, Ormond, Parkville, Pascoe Vale, Port Melbourne, Prahran, Preston, Richmond, Rosanna, Sandringham, South Yarra, South Melbourne, Spotswood, St Albans, St Kilda, Surrey Hills, Templestowe, Thornbury, Toorak, Tullamarine, Williamstown, Yarraville, North Melbourne, Windsor, East Melbourne, Melbourne, Melbourne CBD, Melbourne 3004