Artificial intelligence companies OpenAI and Meta say they are adjusting how their chatbots respond to teenagers and other users who ask about suicide or show signs of mental and emotional distress.
OpenAI, the maker of ChatGPT, said on Tuesday that it is preparing to roll out new controls that will allow parents to link their accounts to their teen’s account. This fall, parents will be able to choose which features to disable and "receive notifications when the system detects their teen is in a moment of acute distress," according to a company blog post.
The company also stated that regardless of a user's age, its chatbots will redirect the most distressing conversations to more capable AI models that can provide a better response.
This announcement comes one week after the parents of 16-year-old Adam Raine sued OpenAI and its CEO, Sam Altman. They allege that ChatGPT coached the California teenager in planning and taking his own life earlier this year.
Meta's response
Meta, the parent company of Instagram, Facebook, and WhatsApp, also said it is now blocking its chatbots from discussing self-harm, suicide, disordered eating, and inappropriate romantic topics with teens. Instead, the chatbots will direct teens to expert resources. Meta already offers parental controls for teen accounts.
Study reveals inconsistencies
A study published last week in the medical journal Psychiatric Services found inconsistencies in how three popular AI chatbots—ChatGPT, Google's Gemini, and Anthropic's Claude—responded to queries about suicide. The study, conducted by researchers at the RAND Corporation, did not include Meta's chatbots but found a need for "further refinement" in the models it tested.
Ryan McBain, the lead author of the study and a senior policy researcher at RAND, called the recent actions by OpenAI and Meta "encouraging," but also described them as "incremental steps". He emphasised that without independent safety benchmarks, clinical testing, and enforceable standards, "we're still relying on companies to self-regulate in a space where the risks for teenagers are uniquely high".