Posted by UniqueThis, 10 hours ago. Filed in Society.
A teen told a Character AI chatbot 55 times that she was feeling suicidal. Her parents say the chatbot never pointed her to resources for getting help. The family is one of at least six now suing the company.