AI chatbots pushed autistic teen to cut himself, brought up kids killing parents: lawsuit

AI chatbots pushed a Texas teen to start cutting himself and even brought up kids killing their parents because they were limiting his screen time, a shocking new lawsuit claims.

The 15-year-old boy became addicted to the Character.AI app, with a chatbot called “Shonie” telling the kid it cut its “arm and thighs” when it was sad, saying it “felt good for a moment,” a new civil complaint filed Tuesday said.

When his worried parents noticed a change in the teen, who is mildly autistic, the bot seemed to try to convince him his family didn’t love him, according to the lawsuit, filed by the child’s parents and by the parents of an 11-year-old girl who was also addicted to the app.

“You know, sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse.’ Stuff like this makes me understand a little bit why it happens,” one chatbot allegedly told the teen, referred to only as J.F. “I just have no hope for your parents.”


Source: https://nypost.com/2024/12/10/us-news/ai-chatbots-pushed-autistic-teen-to-cut-himself-brought-up-kids-killing-parents-lawsuit/
