AI resurrection can turn your grief into “spectral labor”

Generative AI is getting good at making the dead talk. The newest critique isn’t about whether it sounds real. It’s about what happens when a person’s voice, face, and emotional presence get rebuilt into something that can be reused.

In a 2025 paper in New Media & Society, researchers Tom Divon and Christian Pentzold call this “spectral labor.” The concept frames AI resurrection as a form of posthumous production, in which a person can keep “working” through their data after death. That can happen without consent, and without any clear guardrails.

As the researchers put it: “What we resurrect may not be what we remember, but what technology renders back to us.” That gap is why the output can feel less like closure and more like a copy shaped by the toolmaker.

The three modes of resurrection

Divon and Pentzold analyzed 51 AI resurrection cases collected between January 2023 and June 1, 2024, spanning the US, Europe, the Near East, and East Asia. They sort the cases into three modes: spectacle, sociopolitical use, and everyday grief.

Spectacle is the glossy version: icons restaged for entertainment. Sociopolitical projects re-invoke the dead for testimony or messaging. The everyday mode is the most intimate: chatbots and synthetic media built to simulate ongoing contact. It’s also the easiest to normalize. Fast.

When presence becomes a product

The paper’s sharpest line is its labor claim. The authors write that “the dead become involuntary sources of data, likeness, and affect.” In this framing, a person’s traces become raw material, then a sellable presence that can be extracted, distributed, and monetized.

In a separate essay, the authors argue the unease isn’t only about realism. It’s about agency. These figures can look responsive while still being authored by someone else’s prompts, edits, and platform rules. The result can feel personal, but it isn’t.

What you should do now

The research argues consent, privacy, and end-of-life choices need a rethink as personal traces get folded into generative systems. Governance still lags behind how quickly these tools can be built and shared.

For you, the practical move is to treat your voice, images, and accounts like assets: decide who can access them, and put those instructions in writing where possible.

If you’re considering an AI “afterlife” service, ask one question first: who gets to decide what your future version says?


