AI deepfakes that simulate dead people risk “stalking” relatives, researchers warn

AI simulations of the dead pose the risk of “unwanted digital hauntings,” researchers warn.

A new study by ethicists at the University of Cambridge argues that AI chatbots capable of simulating the personalities of deceased people – so-called deadbots – need safety protocols to protect surviving friends and relatives.

Some chatbot companies already offer their customers the opportunity to use artificial intelligence to simulate the language and personality traits of a deceased relative.

Ethicists at the Leverhulme Center for the Future of Intelligence in Cambridge say such ventures pose a “high risk” because of the psychological impact they can have on people.

“It is critical that digital afterlife services take into account the rights and consent not only of those they recreate, but also of those who must interact with the simulations,” said co-author Dr. Tomasz Hollanek of the Leverhulme Centre.

“These services risk causing people great distress by subjecting them to unwanted digital hauntings from alarmingly accurate AI replicas of their deceased loved ones. The potential psychological impact could be devastating, particularly at an already difficult time.”

The findings were published in the journal Philosophy & Technology in a study titled “Griefbots, Deadbots, Postmortem Avatars: On Responsible Applications of Generative AI in the Digital Afterlife Industry.”

The study details how AI chatbot companies that claim to be able to bring back the dead could use the technology to spam family members and friends with messages and advertisements that exploit the digital likeness of the deceased.

Such a result would amount to “stalking by the dead,” the researchers warned.

“Rapid advances in generative AI mean that almost anyone with internet access and some basic knowledge can revive a deceased loved one,” said study co-author Dr. Katarzyna Nowaczyk-Basińska.

“This area of AI is an ethical minefield. It is important to prioritize the dignity of the deceased and to ensure that it is not compromised by financial motives, for example by digital afterlife services.

“At the same time, a person can leave an AI simulation as a parting gift for loved ones who are not ready to process their grief in this way. The rights of both data donors and those who interact with AI services after death should be equally protected.”

The study’s recommendations include procedures for retiring deadbots and greater transparency about how the technology is used.

In the Black Mirror episode “Be Right Back,” AI simulates dead people (Netflix)

As in the Black Mirror episode “Be Right Back,” chatbot users are already using the technology to emulate deceased loved ones. In 2021, a man in Canada tried to chat with his late fiancée using an AI tool called Project December, which he said mimicked her personality.

“Intellectually, I know it’s not really Jessica,” Joshua Barbeau told The San Francisco Chronicle at the time. “But your feelings are not an intellectual thing.”

In 2022, New York-based artist Michelle Huang fed diary entries from her childhood into an AI language model to have a conversation with her past self.

Ms. Huang told The Independent that it was like “reaching into the past and hacking the temporal paradox”, adding that it felt “very trippy”.