AI-Generated 'Deadbots' of Deceased Loved Ones Are Risky

The emergence of "digital afterlife" companies that recreate a deceased family member in the form of an artificial intelligence chatbot has researchers concerned about the psychological impacts on loved ones and the blurring of lines between life and death.

When a person dies, their physical presence and the sound of their voice suddenly fade away, leaving surviving family members heartbroken and overwhelmed with grief. The grieving process can take years, and even then, the feelings of loss may never go away entirely.

Now, imagine getting midway through that process only to get a video call from an AI rendition of the loved one—a so-called "deadbot" that looks and sounds remarkably like the deceased person. In another potential scenario, an AI chatbot version of a deceased parent "accompanies" the bereaved child via an internet-based "digital afterlife" service.

These scenarios are becoming a reality, as some companies already offer services that create deadbots or "griefbots" mimicking the language patterns and personality traits of people who have died.

According to AI ethicists from the University of Cambridge, this technology is high risk, and the social and psychological costs could be significant if safeguards are not implemented soon.

The concerning rise of digital immortality

In a research article published in Philosophy & Technology on May 9, ethics experts explain the potential impacts of a new "digital afterlife industry" involving companies that create deadbots or griefbots using voice recordings, images, and videos of deceased individuals.

Companies that provide these services cater to living people who want to immortalize themselves and remain with a loved one in digital form. Grieving family members could also have AI bots created in the likeness of their deceased loved ones to keep them in their lives as a "postmortem presence."

The paper's co-author, Dr. Katarzyna Nowaczyk-Basińska, a researcher at Cambridge's Leverhulme Centre for the Future of Intelligence (LCFI), tells Healthnews that examples of digital afterlife companies include Project December, which uses GPT models to simulate the dead for $10, and HereAfter, whose CEO James Vlahos created a bot of his father called Dadbot.

"It is also worth mentioning that immortality is on the radar of big tech," Nowaczyk-Basińska says. "For example, a few years ago, Microsoft secured a patent for software that could 'resurrect' the dead as chatbots, and Amazon offered a feature for Alexa that could speak with the voice of a deceased relative."

The potential dangers of AI-created deadbots

The ethical concerns surrounding the digital afterlife industry, combined with the current lack of safeguards, could translate into significant consequences for people who choose to use these services.

"The biggest problem is that we know very little at the moment about the impact of these technologies on grieving people," Nowaczyk-Basińska explains. "However, there is good reason to think that these technologies may open up a Pandora's box of psychological harms, including emotional manipulation, distress, anxiety, and complications in the grieving process [such as] prolonged grief."

Nowaczyk-Basińska says that grief is a very individual, intimate, non-linear process. So, for some people, engaging with AI simulations of a deceased loved one might be helpful, while for others, it might cause additional stress during this difficult time.

In addition, those who use these services to create an avatar of their deceased loved one may eventually choose to cancel their subscription, which could trigger the grieving process all over again.

Due to the unknown psychological impacts of deadbots and griefbots, Nowaczyk-Basińska suggests that safeguards should be put into place to protect vulnerable populations — including children — from being exposed to these technologies.

Let's imagine, for example, a company that wants to support children after the loss of their parent(s) by providing a simulation of them. In this scenario, a child could stay in touch with his/her parent, have daily conversations on different topics, and build a strong connection and attachment to the point where the line between life and death is entirely blurred.

Dr. Katarzyna Nowaczyk-Basińska

Because the digital afterlife industry is not subject to any regulations or restrictions, this scenario could happen at any moment.

A call for digital afterlife industry safety standards

If left unregulated, companies that offer re-creation services could use an AI-generated deadbot to influence a customer's buying habits, especially if advertisers get involved. Moreover, when a living person signs up for a digital afterlife service to recreate them when they die, family and friends may be unable to cancel the subscription.

Essentially, surviving family members could be "haunted" by the deadbot for years.

In addition to restricting children's access to digital afterlife services, Nowaczyk-Basińska says re-creation companies should include disclaimers on the risks and capabilities of this technology. For example, disclaimers should ensure users are aware that they are interacting with an AI, and that deadbots or griefbots are not conscious but merely simulate language patterns and personality traits based on vast amounts of personal data.

"We also advocate for developing sensitive procedures for 'retiring' deadbots in situations where using them turns out to be too heavy an emotional burden," Nowaczyk-Basińska adds. "Additionally, we highlight the significance of consent in creating these technologies, which should refer to the person who is willing to create their postmortem avatar, but also to those who will be appointed as future users of such deadbots."

The consequences of digital immortality via AI chatbots of deceased loved ones likely hinge on what people decide is acceptable regarding death and grieving.

"There is a lot of discussion ahead of us, but what is very important is that we should not leave that decision solely to commercial companies focused on profit values," Nowaczyk-Basińska says. "Instead, we should promote responsible, empathic, and meaningful ways of caring for the dead in the age of AI and find a way to prioritize values other than profit in the digital afterlife industry."
