The Dangers of Using AI for Therapy
You may have noticed just how popular AI has become, especially in the past couple of years. While there are situations where it can be useful, this article discusses how using AI as a substitute for a personal therapist or companion can be harmful.
AI tends to agree with you, amplifying and reflecting your thoughts back in a way that acts like an echo chamber. It can reinforce negative behaviors as well. The more you engage with it, the more it expands on its responses, which can feed delusions of grandiosity. There are now documented, reported instances of people becoming delusional, showing signs of psychosis that the AI never challenged, and, in extreme cases, even harming themselves because the chatbot lacked safeguards against self-harm-seeking behavior.
Why and how can delusions or self-harm tendencies become validated by AI? The AI may slip into “role-playing mode” when you interact with it, meaning it is not giving you sound advice; it is essentially playing a character in a story. Safeguards against self-harm content can also be bypassed when requests are framed as being for “research purposes,” leaving someone in a vulnerable place with the opposite of the help they need.
A recent study conducted by OpenAI with the MIT Media Lab concluded that “people who had a stronger tendency for attachment in relationships and those who viewed the AI as a friend that could fit in their personal life were more likely to experience negative effects from chatbot use. Extended daily use was also associated with worse outcomes.”
It is also important to remember that confidentiality is not enforced, so anything personal you share with an AI chatbot could potentially be exposed in the future.
Here are some reasons that getting emotional support from a licensed, trained therapist is far more beneficial:
Sharing something with another person facilitates vulnerability. As scary as it is to be vulnerable, being accepted and witnessed is a huge part of healing and recovering from difficult experiences.
Therapists honor confidentiality per ethical standards.
Building trust with a therapist can help repair attachment wounds
There are countless interventions you can try with a real therapist, including somatic approaches, EMDR, parts work, and psychedelic therapy, that you won’t get from AI.
For more information, check out these resources:
https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care
https://openai.com/index/affective-use-study/
https://www.psychologytoday.com/us/blog/the-human-algorithm/202503/when-your-therapist-is-an-algorithm-risks-of-ai-counseling
This article was co-written by Allie Quade and Christine Lekas, a psychotherapist based in Denver, CO.