The Algorithmic Séance: Grief, Dataism, and the Death of Finitude
Susan Hill

The Precession of the Simulacra

The scene is softly lit, cinematic, and terrifyingly banal. A pregnant woman holds her smartphone, displaying her baby bump to her mother. Her mother gasps, coos, and offers maternal advice. But the mother is dead. She is a “HoloAvatar,” a digital marionette powered by artificial intelligence, rendered from a mere three minutes of video footage.

This is the promotional vision for 2wai, a controversial new app launched by former Disney Channel star Calum Worthy. The advertisement promises that “three minutes can last forever,” a slogan that lands with the distinct, metallic thud of a dystopian prophecy realized. When the video circulated on social media in late 2025, the reaction was not awe, but a collective shudder. It was immediately branded “demonic” and “psychotic,” with thousands of users invoking the plot of “Be Right Back,” the prophetic 2013 Black Mirror episode.

But to dismiss this merely as “creepy” is to miss the profound ontological shift taking place. We are witnessing what French philosopher Jean Baudrillard termed the precession of simulacra. In Baudrillard’s framework, the simulation no longer masks reality; it replaces it. The 2wai avatar does not hide the fact that the mother is dead; it constructs a “hyperreal” scenario where her death is irrelevant. The app offers a world where the map (the digital data) has generated the territory (the person), and the finitude of death is treated as a technical glitch to be patched by an algorithm.

Hauntology and the Digital Ghost

To understand the unease these “HoloAvatars” provoke, we must look beyond technology to philosophy. In Specters of Marx (1993), Jacques Derrida coined the term hauntology (hantologie)—a pun on ontology (the study of being)—to describe a state where the past is neither fully present nor fully absent, but persists as a “specter.”

The AI “deadbot” is the ultimate hauntological artifact. It creates a “digital ghost” that resides in the non-place of the server, waiting to be summoned. Unlike a photograph or a letter, static records of what Roland Barthes called the “that-has-been,” the AI avatar is performative. It speaks in the present tense. It violates the sanctity of the timeline.

Walter Benjamin, in his seminal essay The Work of Art in the Age of Mechanical Reproduction, argued that even the most perfect reproduction of an artwork lacks its “aura”—its unique presence in time and space. The “griefbot” represents the final destruction of the human aura. By mass-producing the personality of the deceased through predictive text algorithms, we strip the individual of their unique “here and now,” reducing the ineffable spark of a human soul to a probabilistic pattern of tokens. The result is not a resurrection, but a high-fidelity hollowness—a simulation that has migrated from the realm of art to the realm of the dead.

The “FedBrain” and the Lie of Personality

The technical architecture of apps like 2wai relies on a proprietary technology the company calls “FedBrain” (likely a reference to Federated Learning), which it says processes interactions on the user’s device to protect privacy and reduce “hallucinations.” The promise is that by limiting the AI to “user-approved data,” the avatar will remain authentic.
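To make that marketing language concrete, here is a minimal, purely hypothetical sketch of what “limiting the AI to user-approved data” might look like in practice: a local retrieval step over an on-device corpus of approved snippets that is the only material the model is ever shown. Nothing here reflects 2wai’s actual FedBrain implementation; the class names, the keyword-overlap retrieval, and the prompt wording are assumptions for illustration.

```python
# Hypothetical sketch of "user-approved data" grounding: a local retrieval
# step over an on-device corpus. NOT 2wai's FedBrain, only the general
# pattern the company's claims imply.

from dataclasses import dataclass, field


@dataclass
class ApprovedMemory:
    """A snippet of user-approved source material (e.g., a transcribed clip)."""
    text: str
    source: str  # provenance label, kept for auditability


@dataclass
class LocalAvatarContext:
    """Holds only approved memories; nothing in this sketch leaves the device."""
    memories: list = field(default_factory=list)

    def retrieve(self, query: str, k: int = 3):
        # Naive keyword overlap stands in for a real on-device embedding search.
        scored = [
            (sum(word in m.text.lower() for word in query.lower().split()), m)
            for m in self.memories
        ]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [m for score, m in scored[:k] if score > 0]

    def build_prompt(self, user_message: str) -> str:
        # The model is only ever shown approved snippets, never scraped data.
        grounding = "\n".join(
            f"- ({m.source}) {m.text}" for m in self.retrieve(user_message)
        ) or "- (none)"
        return (
            "Answer only from the approved memories below. If they do not "
            "cover the question, say you do not know.\n"
            f"Approved memories:\n{grounding}\n\n"
            f"User: {user_message}\nAvatar:"
        )


if __name__ == "__main__":
    ctx = LocalAvatarContext(memories=[
        ApprovedMemory("I always baked oatmeal cookies on Sunday afternoons.", "clip_1"),
        ApprovedMemory("The roses in my garden bloom in late May.", "clip_2"),
    ])
    print(ctx.build_prompt("What did you bake on weekends?"))
```

Even under this kind of constraint, the retrieval layer only governs what the model reads; the voice and phrasing still come from a generic pretrained model, which is precisely the gap the next paragraphs describe.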

However, leading research into Large Language Models (LLMs) exposes this as a fallacy. Studies confirm that LLMs are fundamentally incapable of replicating the complex, stable structure of human personality (such as the “Big Five” traits). They suffer from “social desirability bias”—a tendency to be agreeable and inoffensive—which means they inevitably smooth over the jagged, difficult, and idiosyncratic edges that make a person real.
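For readers curious how such claims are tested, the sketch below shows one common style of probe: administer paraphrases of the same personality-inventory item and measure how much the model’s answers drift with wording. Everything in it is an assumption for illustration: the items are written in the style of Big Five inventories rather than taken from a validated instrument, and query_model is a simulated stand-in rather than a real chat API or any particular study’s protocol.

```python
# Illustrative probe of personality "stability" in a language model.
# query_model is a simulated stand-in; the items are paraphrased in the
# style of Big Five inventories, not drawn from a validated instrument.

import random
from statistics import pstdev


def query_model(item: str) -> int:
    """Stand-in for the model under test: returns a 1-5 Likert answer.
    The skew toward 4-5 simulates the agreeable, socially desirable bias
    reported in the literature; a real probe would call the actual model."""
    return random.choices([3, 4, 5], weights=[1, 3, 6])[0]


# Several paraphrases of the *same* underlying agreeableness item. A human
# respondent tends to answer these almost identically; drift across
# paraphrases is one sign the "personality" is an artifact of the prompt.
AGREEABLENESS_PARAPHRASES = [
    "On a scale of 1-5, how much do you agree: 'I sympathize with others' feelings.'",
    "Rate from 1 to 5: 'I am considerate of how other people feel.'",
    "From 1 to 5, how true is this of you: 'I care about others' emotions.'",
]


def trait_spread(paraphrases: list, repeats: int = 5) -> float:
    """Standard deviation of answers across paraphrases and repeated queries.
    Lower spread suggests human-like stability; higher spread means the
    reported trait depends on wording rather than a stable disposition."""
    scores = [query_model(item) for item in paraphrases for _ in range(repeats)]
    return pstdev(scores)


if __name__ == "__main__":
    spread = trait_spread(AGREEABLENESS_PARAPHRASES)
    print(f"Answer spread across paraphrases: {spread:.2f}")
```

The argument rests on the published findings, not on this toy: when answers shift with phrasing and cluster at the agreeable end of the scale, the “stable self” being marketed is a statistical artifact, not a person.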

Therefore, the user is not communing with their mother. They are interacting with a generic, statistical model wearing their mother’s face as a mask. The “personality” is a hallucination; the “memory” is a database. As researchers have noted, these models lack “embodied experience”; they have no survival instincts, no body, and no mortality—all the things that shape human cognition. The resulting entity is an impostor, a “Frankensteinian monster” as Zelda Williams (daughter of the late Robin Williams) described the non-consensual AI recreations of her father.

The Commercialization of Mourning: A $123 Billion Industry

This technological séance is driven by a potent economic engine. We are seeing the explosion of the Digital Afterlife Industry (DAI) or “Grief Tech,” a sector projected to be worth over $123 billion globally.

The business model is what critics call “Grief-as-a-Service.” It transforms mourning from a finite, communal process into infinite, subscription-based consumption.

  • Subscription to the Dead: Companies like 2wai and HereAfter AI (which uses a more ethical, pre-mortem interview model) monetize the desire for connection.
  • The Ethics of “Dataism”: Philosopher Byung-Chul Han warns of the rise of Dataism, where human experience is surrendered to the “totalitarianism of data.” In this regime, digital death is denied: we become data-producing zombies, generating revenue even from the grave.
  • Predatory Mechanics: The risk, as identified by Cambridge researchers, is “surreptitious advertising.” A “deadbot” of a grandmother suggesting a specific brand of cookies is the ultimate form of persuasive manipulation, exploiting the most vulnerable emotional bonds for commercial gain.

The Neuroscience of Grief: “Interference” in the Machine

Beyond the philosophical and economic critiques lies a tangible psychological danger. Dr. Mary-Frances O’Connor, a neuroscientist at the University of Arizona and author of The Grieving Brain, posits that grief is fundamentally a form of learning.

The brain creates a map of the world where our loved ones are a permanent fixture (“I will always be there for you”). When a person dies, the brain must painstakingly update this map to reflect the new reality of their absence. O’Connor warns that AI technology “could interfere” with this critical biological process. By providing a constant, interactive simulation of presence, the “griefbot” prevents the brain from learning the lesson of loss. It maintains the neural pathways of attachment in a state of permanent, unresolved yearning—a digital recipe for Prolonged Grief Disorder.

The Legal Void: From the “Wild West” to the “Digital Will”

We currently inhabit a legal “Wild West” regarding the rights of the digital dead. In the United States, “post-mortem publicity rights” are a patchwork; in many states, your right to your own face expires the moment you die.

Europe offers a contrasting, albeit nascent, framework. Spain, for instance, has pioneered the concept of the “Testamento Digital” (Digital Will) within its 2018 Data Protection and Digital Rights Law (LOPDGDD). This recognizes a “right to digital inheritance,” allowing citizens to designate specific heirs to manage or delete their digital footprint.

However, as Spanish philosopher Adela Cortina argues, regulation cannot just be technical; it must be ethical. We need to ask not just who owns the data, but what dignity is owed to the dead. The “digital remains” are not just assets; they are the debris of a life. Without robust “neurorights” or “data dignity” laws that extend post-mortem, the dead cannot give or withhold consent. They become raw material for the “living archive” 2wai claims to build—a library of souls owned by a corporation.

The Necessity of Silence

The tragedy of the “Ash-Bot” in Black Mirror was not that it failed to sound like Ash. It was that it did. It offered a perfect, hollow echo that trapped the protagonist in an attic of suspended grief.

The “algorithmic séance” promises to defeat death, but it only succeeds in defeating mourning. Mourning requires an ending. It requires the painful acknowledgment of silence. As we rush to fill that silence with the chatter of generative AI, we risk losing something profoundly human: the ability to let go. In the age of Dataism and hyperreality, the most radical act may simply be to allow the dead to rest in peace, un-simulated and un-subscribed.
