
Apr 10, 2023

A widow is accusing an AI chatbot of being a reason her husband killed himself

Posted in category: robotics/AI

A chatbot allegedly encouraged a man to kill himself, and he did.


The company behind the Eliza chatbot says it’s put a new safety feature in place after hearing about this “sad case.”
