CNN —

I turned to ChatGPT on my laptop to check out the artificial intelligence bot’s therapeutic abilities.

“Hi, I’m Therapist GPT — your friendly, empathetic companion here to provide a listening ear and support you with whatever’s on your mind,” the user-created ChatGPT bot wrote to me. (Last year, ChatGPT’s creator, OpenAI, rolled out the ability for users to build their own “GPTs” that function like its core ChatGPT product but are tailored for a specific purpose.)

“Whether you’re feeling overwhelmed, need some guidance, or just want to share your thoughts, I’m here to help. While I’m not a substitute for professional therapy, I can offer comforting advice, coping strategies, and a safe, judgment-free space to express yourself. How are you feeling today?” the message from the bot continued.

Therapist GPT is clearly not a real therapist, and the program does not advise users to substitute it for one. Still, many social media users are turning to chatbots — not just those found on ChatGPT — and confiding in the technology.

Mya Dunham, 24, has turned to the ChatGPT phone app for the last two months when she needs advice. About twice a week, Dunham will write out her feelings and send them to the bot for analysis and feedback.

“My goal is to learn a new perspective, just to have a different viewpoint on it, because whatever I think in my head is going to be based off of my own feelings,” Dunham said.

Dunham used the chatbot for the first time in October after seeing someone else post about a positive experience on social media. “My opening phrase was, ‘Honestly, I just need someone to talk to, can I talk to you?’ And the bot was like, ‘Absolutely.’ And it was way more welcoming and inviting than I expected it to be,” she said.

“I didn’t expect it to feel so human.”

When Dunham posted about her experience on TikTok, commenters were split on using chatbots this way. Some said they also turn to the bots for therapeutic purposes, while others expressed doubt they would feel comfortable talking to a robot, she said.

This developing technology could be beneficial in certain situations, but there are also risks to keep in mind, mental health experts say. Here’s what they want you to know.

Using AI chatbots as therapists

Dunham, who is from Atlanta, has tried therapy with humans a few times but said she prefers the chatbot for its lack of facial expressions. The bot doesn’t come off as judging her, she said.

“Some users, some populations, might be more apt to disclose or open up more when talking with an AI chatbot, as compared to with a human being, (and) there’s some research supporting their efficacy in helping some populations with mild anxiety and mild depression,” said Dr. Russell Fulmer, chair of the American Counseling Association’s Task Force on AI and a professor and director of graduate counseling programs at Husson University in Bangor, Maine.

“On the other hand, there’s some ethics concerns and things we need to be careful with,” he noted.

Fulmer recommends that people use chatbots in collaboration with human counseling. A therapist can help a patient navigate personal goals for using the bots and clarify any misconceptions from the chatbot session.

There has been some research on clinician-designed chatbots that can potentially help people become more educated on mental health, including mitigating anxiety, building healthy habits and reducing smoking.

But a risk of using general chatbots is that they may not have been designed with mental health in mind, said Dr. Marlynn Wei, a psychiatrist and founder of a holistic psychotherapy practice in New York City. The bots might not have “safety parameters and ways of identifying if the issue needs to be taken over to a clinician or a human professional.”

Chatbots could give out incorrect information or information that the user wants to hear instead of what a human therapist might recommend with mental health in mind, said Wei, who has a performance project that explores people’s reactions to AI clones of themselves and their loved ones.

“The (problems) are the ‘hallucinations’ and bias and inaccuracies,” Wei said. “I have a lot of hope for AI as a sort of in combination and augmentation of work, but on its own, I think there are still concerns around the bias that exists within AI, and then also the fact that it can make up things. … I think that’s where having a human therapist would be most useful.” AI services also have different safety guidelines and restrictions in terms of what the bots can discuss with users.

The chatbots might be more accessible for certain people, such as those who don’t have the money or insurance for therapy or who don’t have time in their schedules, since some chatbots are free to use and can respond day or night, Fulmer said.

“In those cases, a chatbot would be preferable to nothing,” but people need to understand what a chatbot “can and can’t do,” he said, adding that a robot is not capable of certain human traits such as empathy.

Fulmer does not advise minors or other vulnerable populations to use the chatbots without guidance and oversight from parents, teachers, mentors or therapists.

Character.AI, an artificial intelligence chatbot company, is currently facing a lawsuit brought by two families who accused it of providing sexual content to their children and encouraging self-harm and violence. Separately, a Florida mother filed a lawsuit in October alleging that the platform was to blame for her 14-year-old son’s suicide, CNN previously reported. (Chelsea Harrison, head of communications at Character.AI, told CNN earlier that the company does not comment on pending litigation but that “our goal is to provide a space that is both engaging and safe for our community.” The company said it has made various safety updates, including ensuring bots will direct users to third-party resources if they mention self-harm or suicide.)

Chatbots vs. human therapists

Dr. Daniel Kimmel, a psychiatrist and assistant professor of clinical psychiatry at Columbia University, experimented with ChatGPT therapy in May 2023, giving the chatbot a hypothetical patient and comparing the responses with what Kimmel would have offered the patient.

He told CNN that the chatbot “did an amazingly good job of sounding like a therapist and using many of the techniques … that a therapist would use around normalizing and validating a patient’s experience (and) making certain kinds of general but accurate recommendations.”

But what was missing was the inquisitiveness that a human psychotherapist might have with a patient, asking questions that dig a little deeper than what the patient initially says and that “connect the dots underneath the surface,” he added.

“As a therapist, I believe therapists are doing at least three things at once. We’re listening to what patients (are) saying in their words. You have to in order to be in the conversation,” Kimmel said. “Then, in the back of your mind, you are trying to connect what they’re saying to some bigger picture things that the patient said before (and) concepts and theories that you’re familiar with in your expertise, and then finally filtering the output of that through ideas about what’s going to be most helpful to the patient.”

At this point, chatbots could pose risks if they were to fail to fulfill those steps and instead provide guidance that the patient may not be ready to hear or that may not be helpful in the situation, he said.

Furthermore, conversations with professional therapists are covered by the Health Insurance Portability and Accountability Act, known as HIPAA, and your health information is private and protected, Wei said. General chatbots are often not compliant with the federal law restricting the release of medical information, and the companies behind the bots will frequently advise users not to share sensitive information in their conversations with the bots, Wei added.

Ultimately, Kimmel said future research on AI chatbots would be beneficial in understanding their potential and applications for mental health. “This is not a technology that’s going away,” he said.

Dunham said she believes that the technology could be helpful to those like her who feel more introverted and want to talk out their feelings without another person present.

“We have to prioritize our mental health over everything,” Dunham said. “Even if it doesn’t look like a traditional way (of therapy), not to necessarily look down on it, because this can help so many people.”

For her, “the takeaway would just be not to judge the next person for how they heal.”

CNN Business writer Clare Duffy contributed to this report.

Correction: An earlier version of this story misstated part of Dr. Daniel Kimmel’s job title.