[Image: three human skulls in neon pink, cyan, and yellow against a black background, reminiscent of an artistic, radiographic interpretation.]

If there are people falling in love with chatbots powered by generative AI, you can bet there are people creating chatbots of deceased loved ones. I can understand the temptation, but I can’t see it being a particularly healthy one. And as this article points out, the potential for disinformation and manipulation is immense.

Known as “deadbots” or “griefbots,” these AI clones are trained on data about the deceased. They then provide simulated interactions with virtual recreations of the departed.

This “postmortem presence” can cause social and psychological harm, according to researchers from Cambridge University.

Their new study highlights several risks. One involves the use of deadbots for advertising: by mimicking lost loved ones, deadbots could manipulate vulnerable survivors into buying products.

Another concern involves therapeutic applications. The researchers fear these could create an “overwhelming emotional weight,” prolonging the grieving process through endless virtual interactions with the deceased.

Source: The Next Web