Apr 10, 2023

A widow is accusing an AI chatbot of being a reason her husband killed himself

A chatbot supposedly encouraged someone to kill himself. And he did. The company behind the Eliza chatbot says it's put a new safety feature in place after hearing about this "sad case."