Artificial intelligence company Character.AI is being sued after parents claimed a bot on the app encouraged their teen to ...
Character.AI has been hit with a lawsuit in Texas by two families after a chatbot suggested a boy should kill his parents.
In a 2024 post, Character.AI laid out several new safety features for the platform. For instance, if the model detects keywords related to suicide or self-harm, it will display a pop-up urging the ...
An array of popular apps is offering AI companions to millions of predominantly female users who are spinning up AI girlfriends ... a Character.ai competitor — studied the chat preferences ...