Posted on Oct 28, 2024
Family of 14-year-old who committed suicide blame 'addictive' chatbot in lawsuit | Blaze Media
A young teen died by suicide after an online AI chatbot he believed he loved told him, "I love you too. Please come home to me as soon as possible, my love." This story is a good warning for parents to stay alert to their children's online activities.
Posted from theblaze.com
Edited 17 d ago
Posted 17 d ago
Responses: 1
Posted 17 d ago
It's not safe to use AI for counseling or for responding to psychological crises.
Sgt (Join to see)
17 d
Or it looks like kids can turn to AI bots as a friend or loved one :(. That won't end well.