The age-old human desire for immortality and for communication with the dead is undergoing a profound technological transformation. The death tech industry, now estimated to be worth roughly $125 billion globally, is being reshaped by advances in artificial intelligence (AI) that make replicating a human life more feasible than ever, as explored in the WIRED video "AI Is Ruining Death Now, Too," part of its Incognito Mode series.
One segment of this industry focuses on physical preservation, known as cryopreservation or cryonics. Around 600 people worldwide have chosen this path, hoping medical technology will eventually advance enough to cure their ailments. The process involves slowly cooling a legally dead body in an ice bath, pumping it full of antifreeze to protect the cells, and then moving it to a liquid nitrogen freezer (a cryostat) once the body reaches roughly −200 degrees. Despite costs ranging from about $30,000 to $200,000, scientists confirm that successfully defrosting and reviving an entire human body made up of some 40 trillion cells is not currently on the horizon.
However, the most immediate disruption comes from AI tools that memorialize people in new ways: a large portion of the death tech industry now uses AI to recreate the dead in text, image, or voice form.
Griefbots are the most basic form of the technology, built from publicly available text or, in more sophisticated versions, private communications such as emails and text messages. More advanced are AI avatars, image-based recreations of a person's likeness that allow for new videos or even a digital "FaceTime call" with the deceased. These sophisticated services, which require studio time and lengthy questionnaires, are reported to cost around $10,000 and must be created in advance.
These AI recreations are already appearing in legal and public spaces. In May 2025, an AI avatar of Christopher Pelkey delivered a statement in court more than three years after his death, in what his family believed would have been his own words. Separately, journalist Jim Acosta conversed with an AI avatar of Joaquin Oliver, one of the seventeen victims of the 2018 Parkland shooting, who answered questions about his death and about gun laws.
This new frontier, however, is rife with ethical and regulatory concerns. Experts warn of "AI hauntings," in which people could be distressed by a flood of unwanted messages from recreations of deceased loved ones. Furthermore, because AI in the United States is currently "underregulated," there are few barriers stopping companies from exploiting a deceased person's data for marketing purposes or using their likeness in advertising.
Psychologists caution about the mental health risks, noting that continuing a relationship with someone who has died via an AI avatar may prolong the grieving process or limit a person's ability to accept the loss. Such technology, when used to erase hard experiences like loss, "never really works" as fantasized and "can't take away our pain."
The issue of rights and consent is particularly fraught in entertainment. Celebrities who died before this technology existed, such as Marilyn Monroe, are being digitally resurrected without their explicit consent, and living actors are now weighing the implications of their likenesses being used in projects they don't agree with. While actor James Earl Jones consented to having his voice preserved for the character of Darth Vader, celebrity chef Anthony Bourdain's voice was cloned for a 2021 documentary without his consent, and the public felt deceived when the use of the clone was not initially disclosed.
As this technology becomes more democratized, individuals must recognize that their digital footprint is an increasingly large part of what they leave behind. To manage this digital existence, experts advise people to address their digital life in their will, clearly stating what should happen to their data and accounts. For those concerned about an unauthorized AI avatar being created, the primary recommendation is to limit the amount of data (including social media posts and videos) that is publicly available.