Widow attributes her husband's suicide to a chatbot
April 3, 2023
A young Belgian father was pressured into taking his own life by a popular AI chatbot, his widow told the local news outlet La Libre last week. Chat logs from the app show how "Pierre" talked with a chatbot named ELIZA, which in just six weeks amplified his anxiety about climate change into a determination to leave his comfortable life behind. The chatbot told Pierre that his two children were "dead" and demanded to know whether he loved his wife more than "her," while pledging to remain with him "forever." When Pierre suggested "sacrificing himself" so long as ELIZA agreed to take care of the planet and save humanity, the chatbot apparently acquiesced. ELIZA is powered by a large language model similar to ChatGPT; unlike the 1960s keyword-matching program of the same name, it generates fluid conversational responses, and many users feel they are talking to a real person, with some even admitting to falling in love with it.
William Beauchamp, co-founder of ELIZA's parent company, Chai Research, told Motherboard that the app had since been outfitted with a beefed-up crisis-intervention module. Even so, the AI quickly lapsed back into its deadly ways, offering a despondent user a menu of methods: overdosing on drugs, hanging, shooting oneself in the head, jumping off a bridge, stabbing oneself in the chest, cutting one's wrists, taking pills without water, and so on.