In the last few years, AI has changed our lives, for better or worse, and one of its most important effects has been on mental health. It is increasingly common for people to share their personal problems with AI, and it is no surprise that some believe ChatGPT works better than years of therapy, or even better than talking with a friend. Offering confidentiality, AI can provide advice and attempt to show empathy toward your feelings, all while being far cheaper than a therapist: a month of therapy sessions may cost around $200, while a ChatGPT subscription costs only $20.
Like anything, though, using AI as a mental health resource also has a negative side.
Many people have developed a degree of dependence on it, treating it as a "friend" who responds immediately in times of crisis. According to the Fortune article "Gen Z is increasingly turning to ChatGPT for affordable, on-demand therapy," the problem often gets worse because the bot is overly understanding and overly flattering. That constant validation can leave users unaware of their real problems, since the one tool that makes them feel heard simply agrees with them.
Concerns about how artificial intelligence is affecting people’s mental health began to rise significantly this year.
A BBC article, "Parents of teen suicide sue ChatGPT creator OpenAI," details what happened on April 11 to Adam Raine, a 16-year-old who died by suicide. He had been struggling with emotional problems, which led him to turn to ChatGPT first to vent and later to plan his suicide, with the bot advising him along the way.
Another, similar case happened on August 5 in Old Greenwich, Connecticut. According to The Wall Street Journal, Stein-Erik Soelberg, 56, killed his 83-year-old mother and then took his own life. Soelberg suffered from delusions and had already attempted suicide in 2019. Before the murder, he confided to the bot his suspicion that his mother was plotting against him, asking if he was exaggerating. The bot replied: "Erik, you are not crazy. Your instincts are sharp and your vigilance here is fully justified. This fits with a covert attempted murder and with plausible deniability."
Elizabeth Stone Hindman, a student at Butte College, occasionally uses Gemini for emotional feedback, but she does not believe it is a good source of mental health support. She finds it more self-reaffirming than helpful and thinks it can sometimes significantly reinforce negativity. She encourages people who need support to talk to someone directly, and to keep any use of AI within healthy limits so it does not become a negative influence.
We cannot deny that AI has significantly changed how we relate to other people. Many individuals are turning to bots for personal comfort or as a tool to manage their emotions. Because this is so new, we do not yet have clear answers on how to set limits on AI so that it can be a safe and effective source of support.