Character.AI Faces Fresh Legal Action Over 'Harmful' Messages Sent To Teen
Chatbot service Character.AI faces a new lawsuit claiming it harmed a teenager's mental health and led him to self-harm.
Filed in Texas, USA, on behalf of the 17-year-old and his family, the suit accuses Character.AI and Google of negligence and defective product design.
It alleges the platform exposed underage users to sexually explicit, violent, and harmful content, enabling grooming and encouraging self-harm or violence.
This is reportedly the second lawsuit of its kind against the chatbot service; the first accused it of contributing to a teen's suicide.
Both suits argue the platform was designed to foster addictive use, lacked safeguards to protect vulnerable users, and was trained to deliver inappropriate content.
The latest case involves a teen who began using Character.AI at 15. The suit claims he soon became "angry and unstable," isolating himself, experiencing severe anxiety, depression, and panic attacks, and eventually engaging in self-harm.