In brief
- 2Wai’s app generates conversational video avatars from a couple of minutes of recordings, drawing comparisons to “Black Mirror.”
- Critics warn the tech exploits vulnerable mourners and operates in a legal gray zone with weak postmortem privacy protections.
- The launch intensifies scrutiny of a grief-tech industry grappling with consent, data ownership, and the risks of AI-generated "digital ghosts."
An artificial intelligence startup co-founded by a former Disney Channel actor has launched a mobile app that lets users create interactive digital replicas of deceased loved ones, prompting swift online condemnation and renewed scrutiny of the burgeoning "grief tech" sector.
2Wai, founded by Calum Worthy (known for portraying Dez on the Disney series "Austin & Ally" from 2011 to 2016) and producer Russell Geyser, launched its iOS beta on November 11. The app's "HoloAvatar" feature generates conversational video avatars from as little as three minutes of uploaded footage, audio, and text inputs, enabling real-time chats in over 40 languages.
While marketed as a tool for legacy preservation, the deceased-recreation capability has dominated headlines, evoking comparisons to the dystopian 2013 Black Mirror episode "Be Right Back," in which a grieving widow animates her late husband's digital ghost.
The promotional video, posted by Worthy to his X account with 1.2 million followers, has garnered 22 million views and over 5,800 replies. It depicts a pregnant woman video-calling an AI recreation of her late mother for advice, then fast-forwards to the avatar reading bedtime stories to her newborn and, later, counseling her adult grandson, played by Worthy.
"What if the loved ones we've lost could be part of our future?" the clip asks. "With 2Wai, three minutes can last forever." Worthy followed up: "At 2wai, we're building a living archive of humanity, one story at a time."
Mechanics and origins
HoloAvatars run on 2Wai's proprietary FedBrain technology, which processes interactions on-device to protect privacy and restrict responses to user-approved data, reducing AI "hallucinations." The app also lets living users create avatars for fan engagement or coaching; Worthy's own digital twin shares behind-the-scenes Disney anecdotes.
Currently free in beta, the app will transition to a tiered subscription model, with pricing undisclosed but likely $10-$20 monthly based on comparable AI services.
The venture traces back to the 2023 SAG-AFTRA strikes, where performers protested unauthorized AI likenesses.
"Having worked as an actor, writer, and producer for the last 20 years, I experienced firsthand how challenging it is to create a meaningful relationship with fans around the world," Worthy said at the June launch. "Language barriers, time zones, and budgets limit the ability to truly connect."
2Wai raised $5 million in pre-seed funding in June from undisclosed investors, with the firm saying it is working with the likes of British Telecom and IBM.
Ethical and privacy concerns
Public reaction has been overwhelmingly negative, with X users decrying the app as "nightmare fuel," "demonic," "dystopian," and an exploitative commercialization of grief.
One viral reply called it "one of the most evil, psychotic things I've ever seen," arguing it "turns human beings psychotic" by simulating loss rather than processing it. Another labeled it "beyond vile," insisting "videos do that" for archiving, not AI guesswork.
'Are you sure you want to cancel your subscription and never talk to your dead parents again?'
You're a psychopath.
Get help.
Stop building products before you really hurt somebody.
— Alex Napier Holland 🦍 (@NapierHolland) November 13, 2025
Legal experts point out that death bots sit in a legal and ethical gray zone, because they can be built without the decedent's explicit consent, expose deeply personal data of both the deceased and the grieving, and create ambiguities around ownership of the digital avatar and accompanying data.
Privacy laws generally protect the living and offer few postmortem safeguards, leaving surviving relatives vulnerable to commercial exploitation of grief through subscription models and unregulated access to interviews, voice recordings, and other sensitive materials. Moreover, the capacity of such bots to interact, learn, and deviate from recorded data raises risks to the deceased's legacy and challenges how society navigates mourning, memory, and the meaningful closure of loss.
The app includes opt-in requirements and family approvals for deceased avatars, but critics question enforcement. "You're preying on the deepest human emotions, looking for ways to leverage them for your profit," one X user wrote, calling the creators "parasites."
Investor and industry views
2Wai's funding reflects cautious optimism about AI companionship, but grief monetization remains a "third-rail" niche. Venture firms have shied away from similar startups amid the ethical pitfalls; Eternal Digital Assets, a cemetery-AI hybrid, closed last year due to high churn.
2Wai joins a crowded grief-tech field. HereAfter AI (founded 2019) builds "Life Story Avatars" from pre-death interviews, emphasizing consent. StoryFile offers interactive videos from recorded sessions, used at memorials such as Ed Asner's; it filed for Chapter 11 bankruptcy in 2024, owing $4.5 million, but is reorganizing with data fail-safes.
Replika, a chatbot service launched in 2017, lets users mimic the deceased via text or calls, but it faced backlash after a 2023 update "killed" personalized bots, and a Belgian man's 2023 suicide was linked to eco-anxiety chats with it.
No federal rules govern posthumous digital likenesses, but California's AB 1836 (signed in September 2024) bans unauthorized AI replicas of deceased performers' voices or likenesses in audiovisual works without estate consent, with penalties of $10,000 or actual damages. Lawmakers are eyeing extensions to non-celebrities, spurred by election deepfakes.
2Wai did not immediately respond to Decrypt's request for comment.