Shadows of the Deceased

At the behest of those left behind, AI can now bring the dead back to life, which raises an ethical question or two.

Current technology can offer relief for the pain of losing a loved one: the chance to hear that person’s voice again. In scenes reminiscent of the Black Mirror episode ‘Be Right Back,’ artificial intelligence can recreate the personalities, voices, and even conversational quirks of the deceased, drawing on their digital footprints—text messages, voicemails, and social media posts.

Companies like HereAfter AI and Replika are pioneers in this space, enabling grieving families to interact with virtual avatars that feel eerily lifelike. For some, these digital echoes provide solace, a way to process grief by preserving a connection to those who have passed. A widow might hear her husband’s laugh; a child might ask a virtual parent for advice. 

Yet, this technology, while innovative, treads a fine line between offering comfort and engendering obsession. The emotional allure is undeniable, but so are the risks. These AI recreations, built on algorithms and data, lack true consciousness or agency. They are mirrors reflecting curated memories, not living souls. The bereaved may find themselves clinging to a synthetic version of their loved one, potentially stunting the natural grieving process. 

Graph: AI Investment

Ethical questions loom large: is it healthy to sustain such connections, or does it foster a dangerous detachment from reality? As AI grows more sophisticated, capable of mimicking human nuances with startling accuracy, society must grapple with whether these virtual resurrections honour the dead or trap the living in a digital illusion.

The Technology Behind Digital Ghosts

The mechanics of recreating a loved one through AI are both intricate and unsettlingly straightforward. Modern systems rely on vast datasets—emails, texts, voice recordings, and social media activity—to construct a digital persona. Machine learning algorithms, particularly large language models like those powering ChatGPT, analyse these inputs to replicate speech patterns, tone, and personality traits. 
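In highly simplified form, the first step of such a pipeline is to distil a person’s messages into style cues that can steer a chat model. The sketch below is purely illustrative—`build_persona_prompt` and its heuristics are hypothetical, not the method used by any company named here—but it shows the basic idea of turning a message archive into a persona prompt:

```python
from collections import Counter
import re

def build_persona_prompt(name, messages, top_words=5):
    """Toy illustration: distil a few style cues from a person's
    messages into a system prompt for a chat model."""
    # Crude vocabulary profile: the person's most-used longer words.
    words = re.findall(r"[a-z']+", " ".join(messages).lower())
    favourites = [w for w, _ in Counter(words).most_common() if len(w) > 3]
    favourites = favourites[:top_words]
    # Crude rhythm profile: typical message length in words.
    avg_len = sum(len(m.split()) for m in messages) / len(messages)
    return (
        f"You are emulating {name}. "
        f"Favour words like: {', '.join(favourites)}. "
        f"Keep replies to roughly {avg_len:.0f} words, "
        f"matching their texting style."
    )

messages = [
    "Honestly, brilliant day at the allotment!",
    "Honestly can't wait to see you, love.",
    "Pop round later, the kettle's on.",
]
print(build_persona_prompt("Arthur", messages))
```

Real systems go far beyond word counts—fine-tuning large models on the full archive and cloning the voice from audio—but the principle is the same: the avatar is a statistical portrait assembled from whatever data the person left behind.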

Companies such as Eternime and MyHeritage’s Deep Nostalgia use natural language processing and deepfake technology to animate photos or generate lifelike audio. For instance, HereAfter AI interviews users while they are still alive to create a bespoke digital archive, which can later be accessed as a conversational avatar. The results can be astonishingly realistic.

Yet, the technology is not flawless. These avatars can produce inconsistent responses, occasionally breaking the illusion with generic or incongruous replies. Moreover, the data required raises privacy concerns. Harvesting a deceased person’s digital footprint often involves navigating complex legal and ethical terrain, particularly if consent was not explicitly given. 

Critics argue that such systems commodify grief, with companies charging subscription fees—sometimes upwards of £100 annually—for access to these digital memorials. Proponents, however, see them as a natural evolution of remembrance, akin to leafing through old photos or letters, only more interactive. The question remains whether this technology offers a tool for closure or opens a Pandora’s box of emotional dependency.

Virtual Mourning

The emotional impact of AI recreations is profound, and thus fraught with peril. Psychologists are divided on their therapeutic value. Some, like Dr Elaine Kasket, author of ‘All the Ghosts in the Machine,’ argue that these avatars can aid the grieving process by allowing users to express unresolved feelings.

In China, creating digital avatars of deceased loved ones can cost as little as 20 yuan ($3). Zhang Zewei, founder of Super Brain, an AI company launched in mid-2023, told the South China Morning Post that his firm has enabled ‘thousands’ of people to ‘digitally resurrect’ their loved ones using just 30 seconds of audiovisual data. These avatars range from simple chatbots paired with digital images to sophisticated 3D human models, offering varied ways to preserve and interact with memories of the departed.

Yet prolonged interaction may prevent the acceptance of loss, fostering a state of ‘digital liminality’ where the bereaved remain tethered to an artificial presence. Cases have emerged of users spending hours each day conversing with avatars, neglecting real-world relationships.

Ethical dilemmas extend beyond the individual. There is the question of consent: can the deceased truly agree to their digital resurrection? Without clear directives, companies risk exploiting sensitive data. Cultural attitudes also vary: Western societies may embrace such technologies, but in cultures where death is deeply ritualised, digital recreations might be seen as disrespectful. There is also the spectre of manipulation: could unscrupulous firms use these avatars to market products or influence decisions? 

As is so often the case with rapid technological progress, regulation lags behind: no global framework currently exists to oversee this technology. As AI advances and becomes capable of ever-more convincing simulations, society must decide whether these digital spectres serve humanity or harm it. In the end, the technology that promises to keep our loved ones close may leave us grasping at shadows, unable to let go.

Statement

AI’s ability to recreate deceased loved ones, inspired by Black Mirror’s dystopian vision, offers a bittersweet balm. By mimicking voices and personalities, it provides solace but risks trapping users in a synthetic present. The technology, driven by sophisticated algorithms, is advancing rapidly, yet it raises profound ethical questions about consent, privacy, and emotional health. While some find comfort in these digital echoes, others warn of dependency and detachment from reality. As society navigates this uncharted territory, it must balance innovation with caution, ensuring that the pursuit of connection does not sever the living from authentic human bonds.