“I started thinking that I could create an AI therapist using the ChatGPT API and modify it to meet the specifications of a therapist,” he said. “It increases the accessibility of therapy by providing free and confidential therapy, an AI instead of a human, and removing the stigma around getting help for people who don’t want to talk to a human.”
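Customizing ChatGPT this way typically amounts to wrapping the API in a fixed "system" instruction that sets the bot's persona. The sketch below is illustrative only, not Em's actual code; the persona prompt, function names, and model choice are assumptions, using the openai Python package.

```python
# Minimal sketch (assumptions, not Em's real implementation): steer the
# ChatGPT API toward a supportive, therapist-style persona via a system prompt.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical persona instruction; the real app's prompt is not public.
SYSTEM_PROMPT = (
    "You are a supportive, non-judgmental listener. Respond with empathy, "
    "ask gentle follow-up questions, and never present yourself as a "
    "licensed clinician or give medical advice."
)

def reply(history, user_message, model="gpt-3.5-turbo"):
    """Send the running conversation plus the new message and return the bot's reply."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    messages += history  # prior turns as {"role": ..., "content": ...} dicts
    messages.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(model=model, messages=messages)
    return response.choices[0].message.content

# Example turn
if __name__ == "__main__":
    print(reply([], "I've been feeling overwhelmed at work lately."))
```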
In theory, AI could be used to help meet the growing need for mental health options and the lack of mental health professionals to meet those needs. “Accessibility is simply a matter of a mismatch between supply and demand,” Iyer told BuzzFeed News. “Technically, the AI supply could be infinite.”
In a 2021 study published in the journal SSM Population Health that included 50,103 adults, 95.6% of people reported at least one barrier to health care, such as the inability to pay for it. People with mental health problems appeared to be particularly affected by barriers to care, including cost, shortage of experts and stigma.
In a 2017 study, people of color were found to be particularly susceptible to barriers to health care as a result of racial and ethnic disparities, including high levels of mental health stigma, language barriers, discrimination, and lack of health insurance coverage.
One advantage of AI is that a program can be translated into 95 languages in a matter of seconds.
“Em’s users are from all over the world, and since ChatGPT is translated into multiple languages, I’ve noticed that people use their native language to communicate with Em, which is very helpful,” said Brendle.
Another advantage is that while AI can’t provide true emotional empathy, it can’t judge you either, Brendle said.
“AI tends not to judge, in my experience, and that opens a philosophical doorway into the complexity of human nature,” Brendle said. “Even though a therapist presents as non-judgmental, as humans we tend to be judgmental anyway.”
When AI should not be used as an option
However, mental health experts warn that AI may do more harm than good for people seeking more detailed information, needing medication options, or finding themselves in a crisis.
“Having predictable control over these AI models is something that is still being worked on, so we don’t know in what unintended ways AI systems could make catastrophic mistakes,” Iyer said. “Since these systems do not know what is true from what is false or what is good from what is bad, but simply report what they have previously read, it is quite possible that the AI systems may have read something inappropriate and harmful and will repeat that harmful content back to others seeking help. It is too early to fully understand the risks here.”
People on TikTok also say adjustments need to be made to the online tool; for example, they say the AI chat could provide more helpful feedback on replies.
“ChatGPT is often reluctant to give a definitive answer or make a judgment about a situation the way a human therapist could,” Lum said. “In addition, ChatGPT lacks the ability to offer a new perspective on a situation that a user may have overlooked but a human therapist could see.”
While some psychiatrists believe that ChatGPT could be a helpful way to learn more about medications, it shouldn’t be the only step in treatment.
“It may be better to consider asking ChatGPT about medications as if you were looking up information on Wikipedia,” Torous said. “Finding the right drug is all about tailoring it to your needs and your body, and neither Wikipedia nor ChatGPT can do that right now. But you may be able to learn more about medications in general so you can make a more informed decision later.”
There are other alternatives, like calling 988, a free crisis hotline. Crisis hotlines have call and message options available for people who cannot find mental health resources in their area or who do not have the financial means to connect in person. There are also the Trevor Project Hotline, the SAMHSA National Helpline, and others.
“There are really great and accessible resources, like calling 988 for help, which are good options when there’s a crisis,” Torous said. “It’s not recommended to use these chatbots during a crisis, as you don’t want to rely on something that hasn’t been tested and isn’t even designed to help you when you need it most.”
The mental health experts we spoke to said AI therapy could be a useful tool for venting emotions, but until further improvements are made, it won’t be able to outperform human experts.
“Right now, programs like ChatGPT are not a viable option for those looking for free therapy. They can offer some basic support, which is great, but not clinical support,” Torous said. “Even the creators of ChatGPT and related programs are very clear not to use them for therapy at this time.”
Dial 988 in the US to reach the National Suicide Prevention Lifeline. The Trevor Project, which provides suicide prevention help and resources for LGBTQ youth, is 1-866-488-7386. Find other international suicide helplines at Befrienders Worldwide (befrienders.org).