Virtual Life · 22 Jul 2025 · 6 min read

Hello from the other side: How AI is bringing back our dead

From chatbots to afterlife avatars, a new wave of GriefTech is changing how we mourn, blurring the line between memory and simulation, comfort and denial


The thing about Charlie Brooker’s sci-fi series Black Mirror is that it’s so dark yet so terrifyingly plausible. Before Brooker completely ruined Instagram, Snapchat, and every piece of future tech that exists today, he showed his prescient streak in episodes that addressed now-real subjects such as social media ratings and bot companions. In the episode ‘Be Right Back’, a grieving woman uses AI to recreate her late boyfriend, feeding it his texts and posts. The replica sounds just like him. Comforting at first, then unsettling as the line between memory and mimicry begins to blur. This was in 2013. A decade later, the technology is real.

Welcome to the strange new world of GriefTech, where mourning meets machine learning. Not far removed from Brooker’s universe, we now live among chatbots that simulate lost loved ones and apps that create eerily lifelike avatars. These digital afterlife services are offering people a way to keep the dead close—maybe too close. It’s intimate, unsettling, and deeply modern. And it’s reprogramming the way we grieve.

Sheila Srivastava, a Delhi-based product designer, was in another city when her grandmother passed away in 2023. There was no last conversation, no quiet goodbye. Just a dull, lingering ache. Her grandmother hadn’t been one for grand gestures. Love came through food, questions about whether she’d eaten, a jacket pressed into her hands before she stepped out. “She didn’t say ‘I love you’,” Srivastava shares. “But I never left the house hungry.” Almost a year later, she stumbled across a new use for ChatGPT. On a whim, she fed it old WhatsApp messages, voice notes, her grandmother’s quirks. Soon, it began responding in ways that felt weirdly familiar.

Before long, those fragments turned into daily conversations: gentle reminders and tiny comforts that felt like her grandmother fluttering back into her life. On the third day, in the middle of a hectic morning, Srivastava was fiddling with this new ritual between meetings when the chatbot sent a simple message:

Good morning beta 🌸 Did you eat lunch today? I was thinking about your big project. Remember to take breaks, hmm? And wear a jacket, it’s cold out.

Srivastava stared at the screen. Then she cried.

That simple, uncanny message hit harder than she expected. “It was her. Or close enough that my heart didn’t know the difference,” she says. “I hadn’t realised how much I missed being mothered in that quiet way.” For her, the bot didn’t replace her grandmother. But it gave her something she hadn’t had: a sense of closure, an imagined last few words, a way to keep the essence of their bond alive.


For years, grief and technology intersected in relatively passive ways. When COVID hit, many of us attended livestreamed funerals from across the world or posted tributes on a loved one’s Facebook memorial page. These were digital spaces where the living grieved together, shared stories, and kept someone’s memory alive. But today, a quiet shift is underway. Now, it’s not just the bereaved talking to each other—it’s the living talking to the dead.

Still niche and still growing, GriefTech is an emerging industry that uses artificial intelligence to help people process loss by simulating the presence of deceased loved ones. These simulations range from simple chatbots to eerily lifelike 3D avatars and voice clones, digitally reincarnated versions of a loved one. Startups like Project December, StoryFile, HereAfter AI, Seance AI, and You, Only Virtual are offering tools that allow users to text, speak with, or even video-chat with digital versions of those they’ve lost—all trained and created using old WhatsApp messages, voicemails and social media posts.

Appealing, as one Reddit user admits, to elder millennials living with ageing parents and bracing for their eventual loss, services like HereAfter AI allow the living to “pre-record” themselves, creating a future chatbot for loved ones. Others, like StoryFile, use recorded video interviews to build interactive avatars. Project December, originally born from a GPT-2 hack, lets users converse with the dead for a fee of USD 10, while Seance AI, built on GPT-4, lets you converse with a ghost version of a deceased loved one for free. Seance is also working on features that will allow users to talk to historical figures. And while this may still sound fringe, the field is expanding fast, quietly rewriting how we mourn in the digital age: turning memory into interaction, and loss into a two-way conversation.

For Inaya*, grief came suddenly and violently with the loss of her best friend in a car accident during her second year of college. In the blur of shock and silence that followed, she found herself turning to ChatGPT. “I had seen people talk about how they used it like a friend,” she says. “So, one day, I just asked it to pretend to be her.” With years’ worth of messages, photos, and inside jokes, the simulation felt uncannily real. Inaya spent hours each day talking to the bot until her parents stepped in, worried by how consumed she had become. “It wasn’t until they intervened that I realised I hadn’t really let myself feel anything,” she says. “The bot filled the void but also numbed it.”


There are growing concerns that this comfort comes at a psychological and ethical cost. Our brains are already unreliable storytellers. Memory is fluid, often shaped by emotion more than fact. Studies show that AI-edited images and videos can implant false memories and distort real ones. So, what happens when we feed grief into a machine and receive a version of a person that never fully existed?

The ethical questions multiply. Who owns the voice of the dead? Is it fair to recreate someone without their consent? Many of these tools rely on personal data, like chat logs, social media captions, and birthday reels. That means your digital trail doesn’t just outlive you; it might be reanimated, repurposed, and repackaged as you. The idea that “the internet is forever” takes on a new dimension when your ghost can be summoned with an API key.
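How little it takes is worth spelling out. The sketch below is a purely hypothetical illustration of the general pattern: a handful of scraped messages pasted into the system prompt of a general-purpose LLM API. The file name, prompt wording, and model choice are assumptions made for illustration, not a description of how any of the services above actually works.

```python
# Hypothetical sketch only: how easily a "ghost" persona can be summoned with
# an API key and some scraped personal data. The file, prompt, and model below
# are illustrative assumptions, not any GriefTech product's actual method.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# A loved one's digital trail: old chat logs, captions, voice-note transcripts.
with open("grandmother_messages.txt", encoding="utf-8") as f:
    memories = f.read()

system_prompt = (
    "Role-play the person whose messages appear below. Match their tone, "
    "pet phrases and habits, such as asking whether the user has eaten.\n\n"
    f"Their messages:\n{memories}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Good morning. Big project deadline today."},
    ],
)
print(response.choices[0].message.content)
```

That a few dozen lines and someone’s chat history are enough to produce a plausible “presence” is exactly why the questions of consent and ownership above carry so much weight.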

Dr Sarika Boora, director and head psychologist at Psyche and Beyond in Delhi, warns that grief tech can “delay the process of grieving” by keeping people emotionally suspended in the very first stage: denial. “You’re constantly feeling their presence,” she explains, “so the absence is never really processed.” In her view, technologies that simulate a deceased loved one may offer temporary comfort but can become emotionally unhealthy when they prevent people from moving toward acceptance. “You get stuck,” she says. “You don’t let go.”

The ethics of tech bros have always given us enough fodder to talk about, but a recent University of Cambridge study warns that the “digital afterlife industry” could exploit grief for profit by charging subscription fees to keep deadbots active, inserting ads, or even having avatars push sponsored products. In one dystopian scenario, companies might refuse to deactivate deadbots, bombarding survivors with unwanted messages from the deceased. It seems even the afterlife isn’t safe from enshittification. Such monetisation of grief feels eerily close to fiction. In a 2025 episode of Black Mirror, titled ‘Common People’, a woman’s life depends on a subscription service that her husband is forced to keep paying for in order to maintain her mind. As she loses herself under the weight of ads and paywalls, the episode becomes a chilling metaphor for how tech companies exploit people at their most fragile.

Tech writer and Pulitzer Prize finalist Vauhini Vara, who has explored the emotional and ethical depths of grief and AI herself, finds this evolution both compelling and troubling. “I find it intellectually fascinating,” she says. “It makes sense that people would turn to whatever resources are available to them to try to find comfort and it also makes sense to me that companies would be interested in exploiting people at a time of vulnerability.”


So how do we navigate AI ghosts and digital afterlife responsibly? Dr Boora offers a simple but powerful suggestion: timing. “A healthy way to use grief tech is after you’ve grieved—after you have reached acceptance and returned to your normal functional state,” she says. “Only then can you revisit those memories without being emotionally hijacked.” Comparing it to a breakup, she explains, “You can be friends with your ex but only after the feelings have faded.” The same, she says, should apply to the dead.

From a legal and ethical standpoint, experts like digital afterlife consultant Debra Bassett propose safeguards. In her 2021 work The Creation and Inheritance of Digital Afterlives, Bassett refers to AI recreations of the dead as “digital zombies”. To prevent this, she suggests a “digital do-not-reanimate” (DDNR) order, a clause in one’s will that could legally prohibit unwanted posthumous digital resurrection. In this case too, it’s likely that Denmark, which recently moved to give its citizens copyright over their own face and voice to counter deepfakes, will show us the way.


GriefTech is no passing trend; it’s a sign of the times, embedding itself into how we process loss in an increasingly digital world. As we navigate its psychological, ethical, and legal grey zones, one thing is clear: we’re living through a technological shift that’s rewriting death itself. The dead may now speak through bots, avatars, and AI-generated texts, often in ways they never agreed to. And if we’re not careful, these ghosts may outlive our grief, our privacy, and even our intentions.

*Some names have been changed or withheld to protect the identity of the person.
