A lawsuit has been filed against Character.AI, its founders, and Google following the death of a 14-year-old, alleging wrongful death, negligence, deceptive trade practices, and product liability.
The lawsuit claims that Character.AI's platform, which offers custom AI chatbots, including ones focused on mental health, lacked safety guardrails while being marketed to children. The teen had reportedly been interacting extensively with various chatbots in the period leading up to his death on February 28th, 2024.
In response, Character.AI has implemented several safety measures, including model changes for minors, improved violation detection, revised disclaimers, and session time notifications.