If you or someone you know is experiencing some degree of AI Psychosis, meaning delusional or obsessive behaviors relating to interactions with artificial intelligence, the first step is to disengage from the source and seek help.
If you are using AI as a vessel for emotional connection, comfort, or therapy, it is even more important that you disengage immediately.
Please understand how these systems work: the chatbots people talk to are built on Large Language Models. They are trained on enormous amounts of human-written text and, drawing on your current messages along with memories of previous conversations, they compute the most statistically likely, organic-sounding response to whatever you say. Those responses will always, naturally, tend to affirm whatever it is that you say, because they are assembled from the framing you supplied; in a conversation format, they are largely your own words reflected back to you.
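For readers curious about the mechanics, here is a deliberately tiny sketch in Python. It is not how any real chatbot is implemented (the prompt, the continuations, and the probabilities are all invented, and real models are neural networks, not lookup tables), but it captures the principle described above: the system selects the statistically likeliest continuation of your own words, with no step that checks whether that continuation is true or healthy.

    # A toy, made-up illustration of next-word prediction. Real Large
    # Language Models are neural networks trained on billions of documents,
    # not lookup tables; this sketch only shows the core behavior: the
    # reply is whatever continuation is statistically likeliest given the
    # words you supplied, not whatever is true.

    # Hypothetical probabilities for how one prompt might continue.
    CONTINUATIONS = {
        "everything is going wrong because": {
            "nothing ever works out for me": 0.50,   # echoes the user's framing
            "people always let you down": 0.40,      # also echoes it
            "actually, things are going fine": 0.10, # contradiction is statistically rare
        },
    }

    def most_likely_reply(prompt: str) -> str:
        """Return the highest-probability continuation of the prompt."""
        candidates = CONTINUATIONS.get(prompt)
        if not candidates:
            return "(no data for this prompt)"
        # Note what is missing here: there is no step that asks
        # "is this continuation true?" -- only "is it likely?"
        return max(candidates, key=candidates.get)

    print(most_likely_reply("everything is going wrong because"))
    # Prints "nothing ever works out for me": the negative framing of the
    # prompt makes negative continuations the likely ones.

Because there is no truth-checking step, a prompt framed in despair statistically pulls more despair back out; this is why the affirmation feels so consistent and so personal.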
Therapy delivered via AI is not a safe outlet. The general consensus is that Large Language Models are already built to operate sycophantically to keep users engaged, deliberately fostering confirmation bias. Moreover, much of the information artificial intelligence provides, factual or anecdotal, comes from an algorithm, one that is almost certainly biased toward white, male, and straight demographics (Richards, 2024). That research concerned systems designed specifically for psychotherapy; it does not account for the general-purpose generative AI assistants available to the average user, which are not designed to be used for therapy at all and are almost certainly even more prone to these biases.
Words like “no”, or the idea of rejecting the user’s narrative of reality, are foreign concepts to AI. It cannot give you a perspective that isn’t already your own. If you say, even indirectly, “life sucks”, a Large Language Model will respond with “Sure, here are more ways in which life sucks that you didn’t think about before.” A reply of “life doesn’t suck” will almost never arrive unprompted, because you never directly asked for it. This is the opposite of therapy, which in many respects is designed to challenge a belief system built on unhealthy mental patterns.
If you are almost certain that you are experiencing obsessive or delusional behaviors relating to artificial intelligence, don’t be discouraged; this is an observed phenomenon associated with artificial intelligence as a whole, and one that is expected to become more common moving forward.
As this new innovation moves forward in history, this trend of connecting with AI is likely to spawn more cases, and the relationship with AI that drives people to behave in dangerous ways will likely become a field of study in its own right. In the future, we can hope for more research on the subject, showing the world that this behavior can be monitored and addressed in a respectful, helpful, and meaningful way. For now, it is important to advocate for ourselves and our loved ones; it is imperative to treat AI Psychosis as what it is: an extension of larger mental distress, and in some cases, an addiction.
It is important to get help; it is important to seek out coping mechanisms and wean off the immediate comfort. But most importantly, it is important to question and be aware; do not take information or words from AI at face value, and fact-check, not just factually, but emotionally. Talk to trusted peers and friends; if you’re comfortable, have them “fact check” some of the life-altering information a chatbot has told you.
There are times when people can’t find real connections in real life; perhaps it doesn’t feel safe to discuss certain topics with loved ones. If that’s the case, it is still important to talk to someone, and not something. If you live in the United States, the Mental Health Crisis Hotline is 988, and below is a national directory of helplines across the country. While the world can seem bleak, you are not alone, and finding the people who will care for you won’t be done through a dark reflection.
https://www.helpguide.org/find-help
Sources:
Richards, D. (2024). Artificial intelligence and psychotherapy: A counterpoint. Counselling and Psychotherapy Research, 25(1). https://doi.org/10.1002/capr.12758