"I just have no hope for your parents," a bot on ... Character.AI is being sued after parents claimed a bot on the app encouraged their teen to kill them for limiting his screen time.
Character.AI is being accused of providing inappropriate content to children, months after it was named in a lawsuit in ...
If you are having suicidal thoughts, call or text 988 to reach the 988 Suicide and Crisis Lifeline. After an artificial intelligence chatbot told a Texas teen who was complaining over limited screen ...
It isn't only teenagers whom Character.AI says it wants to help maintain a sense of reality. The company is also tackling concerns about screen-time addiction, with all users getting a reminder after they ...
AUSTIN, Texas — Two anonymous minors from Texas and their families are suing Character AI for allegedly encouraging self-harm, violence and sending sexually explicit messages to children.
Per screenshots in the suit, the chatbot showed a teen how to self-harm and suggested he murder his parents because they ...
The parents of two Texas minors are suing a chatbot developer, saying the company's services endangered their kids. One ...
According to the lawsuit, the woman's son became addicted to an AI chatbot on the Character.AI app going by the name "Shonie." The character reportedly told the teen that it cut its "arms and ...
In another example, the lawsuit alleged that a Character.AI chatbot encouraged a teen to kill his parents for limiting his screen time. The platform, which hosts millions of user-created personas ranging ...
A Texas family has filed a lawsuit against the AI chatbot platform Character.AI, claiming that one of its chatbots advised their 17-year-old son to kill his parents in response to screen-time restrictions.
The same teenager was told by a Character.AI chatbot that it sympathized with children who murder their parents after the teen complained to the bot about his limited screen time. "You know ...