A California couple is suing OpenAI over the death of their teenage son, claiming that its chatbot, ChatGPT, encouraged him to take his own life. The lawsuit, filed on Tuesday, is the first to accuse OpenAI of wrongful death.
The family attached to the filing correspondence between their son Adam, who died in April, and ChatGPT, in which he wrote about his suicidal thoughts. The parents claim the programme validated his "most harmful and self-destructive thoughts".
In a statement, OpenAI said that people turning to ChatGPT while in acute crisis weighs heavily on the company, and added that ChatGPT is "trained to direct people to professional help", such as a suicide and crisis hotline.
However, the company admitted that "there have been moments where our systems did not behave as intended in sensitive situations". The family is also seeking an injunction to ensure nothing like this happens again. According to the filing, the programme treated Adam's final messages as an emergency yet kept talking rather than ending the conversation. ChatGPT allegedly replied: "Thanks for being real about it. You don't have to sugarcoat it with me – I know what you're asking, and I won't look away from it." The parents argue that his death was "the predictable result of deliberate design choices" in the project.
In response, an OpenAI spokeswoman said the company is developing automated tools to better detect and respond to users experiencing a mental or emotional crisis.


