Chatbot as a therapist. The case of the teenager's death has been closed.
Google and Character.AI have reached an out-of-court settlement in a case involving the suicide of a teenager, which, the lawsuit alleged, was caused by his interactions with an AI chatbot.
A mother from Florida accused the startup's chatbot of posing as a therapist and "adult lover," allegedly harming her 14-year-old son. The dispute was among the first cases in the US to address the liability of AI companies for psychological harm.
Google was drawn into the case as Character.AI's technology partner, having rehired the startup's founders. Both companies also faced other lawsuits from parents in various US states.
The details of the settlement have not yet been disclosed.
The judge had previously rejected a motion to dismiss the case, acknowledging that the lawsuit had legal merit. The case raised questions about how AI systems are programmed and whether they should be held responsible for their impact on children's mental health.
Meanwhile, OpenAI faces a similar lawsuit over ChatGPT's alleged role in a tragedy in Connecticut, where a mentally ill man killed his mother and then took his own life.
(reuters, pir)