
Feeling down? How AI chatbots offer 'Therapy' and where their advice comes from

As an increasing number of individuals turn to AI for guidance, three researchers explore whether it can be relied upon for therapeutic support.

New Delhi:

As more people start chatting with AI chatbots such as ChatGPT, discussions about mental health often come up. Some individuals have found their interactions with AI helpful, likening them to a low-cost therapy session. However, three researchers, Centaine Snoswell and Aaron J. Snoswell from Australian universities, along with PhD candidate Laura Neil, caution against thinking of AIs as therapists. They acknowledge that while AI can be impressive and engaging, it does not think or feel the way humans do. They compare AI chatbots to your phone's text suggestions taken to a much higher level: the systems work by analysing a vast amount of text available online and learning its patterns.

When you ask a question, like "How can I stay calm during a stressful work meeting?", the AI builds a response quickly, word by word, each time picking a likely next word based on patterns in its training data. This can make it feel like you're having a real conversation.
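To make that idea concrete, here is a minimal sketch of next-word prediction using the open-source Hugging Face transformers library and the small GPT-2 model. The choice of GPT-2 is an assumption for illustration only; ChatGPT's own, much larger model works on the same principle but is not publicly downloadable.

```python
# Minimal sketch of the "predict the next word" idea the researchers describe,
# using the open-source Hugging Face `transformers` library and the small,
# freely available GPT-2 model (an illustrative stand-in, not ChatGPT itself).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "How can I stay calm during a stressful work meeting?"
# The model extends the prompt one token at a time, each step picking a
# plausible next word based on patterns seen in its training text.
result = generator(prompt, max_new_tokens=40, do_sample=True)
print(result[0]["generated_text"])
```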

However, these researchers stress that AI models aren't human beings, nor are they trained mental health professionals. They lack the qualifications, ethical standards, and professional oversight that actual therapists have.

Where does it learn to talk about this stuff?

Have you ever wondered where AI systems get their information from? The three researchers explain that when you ask a question to an AI like ChatGPT, it pulls information from three main sources:

1. Background Knowledge

AI systems learn by processing a huge amount of text during their training. This text can come from many places on the internet, including academic articles, eBooks, free news stories, blogs, YouTube transcripts, and comments from forums like Reddit. While some of this information can be helpful, it is not always reliable. Because the training data was collected at a particular point in time, some of it may be outdated. And because the AI has to compress so much information into a fixed amount of memory, it can mix things up or get details wrong.

2. External Information Sources

Sometimes, AI developers connect their systems to other tools or sources of knowledge to stay updated. For example, if you ask a question to Microsoft’s Bing Copilot and see numbered references in the answer, that means the AI has used an external search to find the latest information. Some chatbots focused on mental health can even tap into therapy guides to keep conversations on helpful topics.
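Here is a simplified, hypothetical sketch of that pattern, often called retrieval-augmented generation. The functions web_search and ask_model are stand-in stubs invented for illustration, not real APIs from Bing Copilot or any other product; the point is only to show how numbered search snippets end up cited in an answer.

```python
# Hypothetical sketch of retrieval-augmented generation: fetch fresh
# information first, then hand it to the model along with the question.
# `web_search` and `ask_model` are illustrative stubs, not real product APIs.

def web_search(query: str) -> list[str]:
    # Stub: a real system would query a live search index here.
    return [
        "Snippet about the topic from a news site.",
        "Snippet about the topic from an encyclopedia.",
    ]

def ask_model(prompt: str) -> str:
    # Stub: a real system would send the prompt to a language model.
    return "An answer citing sources [1] and [2]."

def answer_with_sources(question: str) -> str:
    snippets = web_search(question)  # up-to-date external information
    # Numbering the snippets is why tools like Bing Copilot can show
    # numbered references in their answers.
    sources = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
    prompt = (
        "Answer the question using the numbered sources below, citing them.\n"
        f"Sources:\n{sources}\n\nQuestion: {question}"
    )
    return ask_model(prompt)

print(answer_with_sources("What is the latest guidance on workplace stress?"))
```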

3. Information You Provide  

AI platforms can also remember things you have shared during your conversations or when you signed up for the service. For instance, when you register for Replika, a companion AI platform, it learns your name, pronouns, age, and preferences, and it may also know your location and the device you are using. The researchers explain that this means anything you have told the AI before may be referenced in future conversations. Unlike a professional therapist, who is trained to help you work through challenging thoughts, AI systems tend to simply reinforce what you have already said, which can lead to a less balanced conversation.
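Below is a hypothetical sketch of this third source: profile details and past messages are stored and quietly folded into each new prompt, which is how earlier conversations can resurface later. The field names loosely mirror the sign-up details mentioned for Replika, but the structure itself is an assumption, not Replika's actual design.

```python
# Hypothetical sketch: stored user details and past messages are prepended
# to each new prompt, so earlier conversations can shape later replies.
# The profile fields echo the Replika sign-up details the researchers
# mention; the data structures here are assumptions for illustration.

user_profile = {
    "name": "Sam",
    "pronouns": "they/them",
    "age": 29,
    "preferences": ["journaling", "evening walks"],
}
conversation_memory: list[str] = []  # earlier messages the system has kept

def build_prompt(new_message: str) -> str:
    profile = ", ".join(f"{k}: {v}" for k, v in user_profile.items())
    history = "\n".join(conversation_memory[-5:])  # last few remembered lines
    return (
        f"User profile: {profile}\n"
        f"Recent history:\n{history}\n"
        f"User: {new_message}"
    )

conversation_memory.append("User: I've been feeling stressed at work.")
print(build_prompt("Any tips for tomorrow's meeting?"))
```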

What about apps designed for mental health?

Many people are aware of popular AI tools like ChatGPT from OpenAI, Google’s Gemini, and Microsoft’s Copilot. These are general-purpose AIs, which means they can talk about a wide range of topics but aren’t specifically built for any one subject.

However, the researchers note that developers can create specialised AIs aimed at particular themes, such as mental health; chatbots like Woebot and Wysa focus on this area. They point to studies suggesting that these mental health chatbots can help reduce feelings of anxiety and depression, and that they can support therapy practices like journaling by offering helpful advice. According to the researchers, some evidence indicates that using these chatbots can provide short-term mental health benefits similar to those of talking to a professional therapist.

It is important to note, they say, that the research so far looks only at short-term effects; we do not yet know how using these chatbots over a long period might affect someone's mental health. Many studies also exclude participants dealing with severe mental health issues, such as those who are suicidal or have serious disorders. And some of the research is funded by the companies that create these chatbots, which could influence the results.

Researchers are also looking into possible negative effects and risks associated with using these chatbots. For instance, a chat platform called Character.ai has become involved in a legal case linked to a user’s suicide, raising concerns about safety.

Overall, the three researchers believe that these AI chatbots could be useful tools to help bridge the gap where there aren’t enough mental health professionals. They could assist in making referrals, provide support between therapy sessions, or help those on waiting lists until they can see a professional.

Key takeaway 

For now, the three researchers find it difficult to determine whether AI chatbots can be trusted as a complete substitute for therapy. More studies are needed, they say, to understand whether certain people might be more vulnerable to any negative effects these chatbots may have. There is also uncertainty about potential issues like becoming too emotionally attached to a chatbot, feeling lonelier, or overusing it.

AI chatbots can be helpful when you're having a tough day and just need someone to talk to. However, if those tough days keep happening, it’s important to reach out to a professional who can provide proper support.

ALSO READ: What exactly is 'Vibe Coding,' and why is it causing a stir within the IT sector?