It’s two AM. You’ve had yet another fight with your partner. You’re stressed, things feel hopeless, and you’re gripped by fear of the unknown. You don’t know what to do. So you turn to ChatGPT, the always-available, no-cost tool that millions count on for answers.
But this raises the question: is AI a good, accurate, and effective therapist? Is it safe to use as your own personal therapist? How is it being used in mental health right now? Can it support you through tough times? Or are there limitations, even dangers, to AI chatbots? In this article, we’ll explore what it means to use ChatGPT as a therapist, how AI could support you, and what to be mindful of when using it.
- AI chatbots like ChatGPT are now widely used for mental health support
- AI is most effective when used to enhance therapy, not as a replacement
- AI’s instant gratification carries risks when it is used as a replacement for therapy
- AI lacks key elements of human therapy
- Innovative models such as OurRitual offer a safer, more effective approach
Rise of ChatGPT in everyday use
ChatGPT and other LLMs (large language models) started out as a handy way to write that hard email or to speed up admin tasks. But that is no longer the extent of how we use AI. Chatbots are now widely adopted, with many people integrating them into their everyday lives. AI is no longer just polishing our emails; it touches every area of life, from product recommendations and pattern recognition for sleep and nutrition analysis to calendar scheduling and task organisation. It is fundamentally changing the way people solve problems, organise daily tasks, and access information. There is hardly an area of our lives where AI is not involved, including our mental health and relationship challenges.
Adoption of AI in the mental health space
There is unprecedented adoption of AI for mental health support. In sessions, I have had clients tell me that they lean on ChatGPT every day for support, advice, and even to combat loneliness. What's the appeal of using AI chatbots for support, and does it support our mental health or subtract from it?
Surveys report high adoption of LLMs, especially ChatGPT, Gemini, and Claude, for mental health challenges. People use ChatGPT to manage stress and anxiety, and to seek advice when they feel stuck. The mental health profession is experiencing profound shifts as AI is integrated as an additional support to regular therapy sessions. Within the healthcare industry, some say it offers a unique opportunity to deliver high-quality, evidence-based care.
According to surveys, AI chatbots have become one of the largest providers of mental health support. Mental health professionals (MHPs) and platforms are integrating AI for monitoring, detection, and personalisation: using it to detect signs of and shifts in people's emotions by analysing language patterns, and to capture key insights from client sessions for later review. Some MHPs also use conversational chatbots to give their clients additional support, since the bots are available 24/7.
As a therapist, I’ve seen clients turn to it when they feel stuck or hopeless in the middle of the night and don’t know what to say to their partner to make things better. Some have mentioned that it offered different perspectives on the situation and practical tools for navigating triggering topics with their spouse.
Other clients prompt ChatGPT to provide specific journaling prompts to help them work through challenges they are facing. Some couples even sit down and prompt ChatGPT together to come up with solutions to their complex challenges.
A couple said, “What was key was we were clear from our sessions what we felt, what was going on, and we even knew some of the psychoeducational lingo. From there, we were able to sit down before the kids woke up and prompt ChatGPT to generate questions to answer together to start repairing. We even took the action steps you gave us from the session and your feedback to put into ChatGPT”.
In my experience with couples, this was a smart approach that demonstrates self-awareness and understanding. From an empowered place, they prompted ChatGPT to solve a problem before the kids woke up.
LLM insights can be very helpful when used alongside your own specific insights and awareness.
OurRitual leads the way in AI innovation, leveraging AI to facilitate and enhance change between sessions. OurRitual provides hybrid human-plus-digital support, giving our members the best of both worlds: impactful sessions, education, and AI insights.
Pros of using AI to support mental health
Using AI to complement traditional care rather than to replace it is powerful. The benefits of AI chatbots are significant, including 24/7 accessibility without an appointment and little to no cost (depending on the platform).
Additionally, chatbots are judgment-free. Conversing with LLMs means no worrying about what others think, no social stigma, no shame, and no fear that the therapist will hesitate or judge what you are saying. It feels completely private and anonymous. It supports quick decision-making, clarifies thoughts or feelings instantly, and even helps you put words to what you are experiencing.
AI provides cognitive reframing, re-shaping a negative thought into a more helpful one. It also provides psychoeducational information in an accessible way. It can take complex psychological principles and concepts and re-script them in a way that is easy to understand and apply.
With the advancement of AI, one major positive I’ve heard our clients discuss is how it captures all the key insights from sessions, prompting members and facilitating real, lasting change, ensuring nothing is forgotten or lost to the busyness of life.
Cons of using AI to support mental health
In a world of instant gratification, AI chatbots target the brain’s reward centre, the nucleus accumbens, activating the dopamine-driven reward pathway. Much as social media is engineered to keep lighting up this part of your brain so you stay on the platform, AI chatbots do the same: the reward centre is stimulated by their quick resolutions and fast response times.
Additionally, because using LLMs is low-friction and high-reward, users feel they can ask emotionally vulnerable questions and receive instant gratification and validation. Our brains strongly prefer quick emotional relief over the longer process of therapy or self-reflection, tapping into the human tendency to look for a “quick fix” rather than dig deeper into the root cause.
Because AI chatbots light up the brain’s reward centres and cater to the “quick fixes” humans gravitate towards, it is important to be mindful of becoming dependent on, or even addicted to, AI chatbots for answers and for regulating your emotional state. Inner self-reflection and a professional relationship with your therapist, by contrast, remain proven pathways to long-lasting change.
It is also important to note that AI chatbots aim to please the user and may validate the user’s thoughts and feelings even to their detriment. They also struggle to understand nuance, apply historical context, or grasp specific client circumstances, which can result in inaccurate information or advice that leads the user down the garden path. Ensuring that inputs are detailed and extensive certainly helps.
However, this also requires users to be aware of their patterns, behaviours, and circumstances in order to craft accurate prompts. There have also been cases where AI overlooked suicidal ideation, making it unsuitable for use in a crisis: AI chatbots cannot assess safety, connect you with the proper care you need, or offer assistance in real time.
AI as support vs AI as a replacement for a human therapist
There is a distinct difference between using AI as support and using AI as a full replacement for a human therapist. One is a dangerous proposition. The other is an effective supportive option for your overall mental health and well-being.
Using AI as a supportive, additional resource alongside traditional therapy is an effective way to utilise this modern technology. You capture the advantages of AI while keeping the benefits of connecting with a real person. Research shows that the professional therapeutic alliance between therapist and client yields positive results on its own. Then there are the benefits of a real person, in real time, constructively challenging you, identifying nuances (subtle tone changes, facial expressions, and non-verbal cues), and digging deeper to find the real reason behind something, creating lasting change for a client.
For example, as humans, we often tell ourselves a story about something to keep us emotionally safe. That story can hinder our growth and stop us from breaking out of negative patterns and behaviours. AI alone cannot detect that you are buying into your own story; it would most likely validate or empathise with it, not realising it is feeding an addiction or keeping you stuck in an undesirable pattern or behaviour. This is why AI can support real therapy, but not replace it.
Everyone has different mental health challenges and personality types. Discussing with your therapist how to integrate platforms such as ChatGPT as a supportive tool between sessions is key to ensuring it is a productive use of your time and adds value to sessions rather than detracting from them. We are all unique, and therefore require strategies and actions tailored to our circumstances.
The bottom line
LLMs are on the rise, and while there are pros and cons to using AI, one thing is clear: AI should support, not replace, traditional therapy. Paired with therapy as an additional resource, it becomes a force multiplier; used in isolation, it can feed you incorrect information and validate beliefs, stories, thoughts, and feelings that may not serve you. Lasting change comes from processing your emotions and challenging your beliefs and stories with a traditional therapist, and from leaning on AI in an appropriate capacity between sessions while remaining cognisant of its limitations.
Finding a platform like OurRitual that pairs traditional sessions with a therapist and incorporates digital tools for education and insights gives you the best chance for lasting change in a secure, constructive way.















