Digital clone debate: AI avatars for creators
Celebrities and business leaders are using AI replicas and chatbots. A startup CEO and a nonprofit founder weigh the promise and risks.
A growing slice of the media economy is experimenting with the idea that “you” can be duplicated in software. From shopping livestreams to coaching apps, digital avatars are moving fast, raising a basic question for creators and businesses alike: is building a digital clone a smarter way to grow, or a risk that could backfire?
In recent months, celebrities, influencers, and business leaders have been using artificial intelligence to create digital replicas of themselves. Some of these avatars are designed for e-commerce or marketing, while others are built to act like an always-on digital version of a person’s work or personality. The technology is spreading across multiple parts of the media business, not just among novelty accounts.
One example highlighted in the debate is the Chinese influencer Luo Yonghao, whose AI avatar helped drive more than $7 million in product sales during a livestream last year. In parallel, other creators are training chatbots on their own body of work, aiming to give fans a version of their “mind” that can be accessed at any time. That shift—from passive content to interactive, conversational replicas—is part of what makes the trend feel like more than a gimmick.
The adoption is also bleeding into mainstream entertainment and consumer services. An AI replica of deceased actor Val Kilmer is set to appear in the film “As Deep as the Grave” later this year. Meanwhile, fans of motivational speaker Tony Robbins are reportedly paying $39 a month for life coaching from his AI replica. These cases illustrate how digital clones are being packaged not only as entertainment, but as an ongoing product with a subscription price.
At the center of the controversy and opportunity is the question of monetization and audience reach. Dara Ladjevardian, cofounder and CEO of Delphi, argues that a digital clone can broadcast ideas in a way that expands what a creator can do with limited time. He says a replica can engage with audiences on a creator’s behalf, helping surface conversations and opportunities that might otherwise compete for attention when the real person can only be in one place at a time.
Ladjevardian also frames digital cloning as an interactive evolution of existing media experiences. He compares the idea to art: viewers can experience parts of an artist through a painting, and now audiences can engage with an interactive version of a creator’s mind. For business operators, the appeal is clear: instead of one-way messaging, the technology can turn a creator’s work into a persistent interface.
On the other side, Will Kreth, founder of Human & Digital (HAND), a nonprofit focused on authenticating the identities of human actors, professional athletes, and other public figures, is skeptical. He argues that there is something fundamentally “off” about representation when AI is used to mimic humans, describing an uncanny valley effect in which the output can feel unnatural to audiences. His point isn’t only about quality; it is also about how people react emotionally to seeing a synthetic version of a real person.
The debate also touches on what makes synthetic “creators” compelling in the first place. Kreth raises a concern about the limits of large language models as creators of truly new ideas. While an LLM can combine inputs and draw conclusions, he questions whether it can make the next leap of insight—the kind that leads to genuinely novel thinking that makes people say they never would have imagined it otherwise.
Monetization is where the economics become especially tangible. Ladjevardian points out that creators already monetize books and courses, and he believes digital replicas introduce a new format that can function like a 24/7 mentor. He describes creators monetizing the service as an add-on to communities or courses, including “24/7 office hours,” while others offer replicas for free and use audience interaction to support more indirect selling in conversation.
In his view, the biggest beneficiaries are likely to be the most trusted creators. Ladjevardian says the top tier of creators will gain the majority of the benefits because their reputation and audience trust can translate into willingness to engage with an avatar that represents them. That implies a winner-takes-more dynamic: established brands may convert loyalty into recurring interactions, while newcomers may struggle to generate the same credibility fast enough.
A core fear in these systems is authenticity, and whether synthetic versions degrade the real brand. Kreth responds with a blunt test: if a knockoff version “bombs,” it could reflect badly on the person it is meant to represent. The underlying risk is reputational, but it is also structural: when an avatar speaks on your behalf, it inherits responsibility for outcomes, tone, and perceived legitimacy.
Ladjevardian counters by emphasizing positioning and transparency. He notes that some of the most successful creators on his platform discuss the technology openly, using it to explain why it can help them answer questions they otherwise would not have the bandwidth to handle. He also argues that abundance can increase value: when more people talk to digital minds, he says, they may become more interested in meeting the real person.
The debate then shifts into the human side of the business model—relationships, not just transactions. Kreth points to concerns about parasocial bonds, noting that social media networks and large language model companies have faced lawsuits tied to allegations that users form unhealthy emotional attachments to products. He worries that some people may not be emotionally equipped to handle the potential for addiction to the relationship, especially when platforms lack adequate guardrails.
Ladjevardian does not dismiss the idea of parasocial connection, but he suggests the boundary is not new. He recalls how, as a child, his sister fixated on Johnny Depp after watching “Pirates of the Caribbean,” keeping a cardboard cutout in her room. His argument is that fans have always formed “fantasy” relationships around media figures they don’t have access to directly, and he contends that interactive digital replicas can even make that engagement feel more realistic rather than less.
The question becomes even sharper when considering what happens if synthetic people proliferate across media. Kreth says frameworks are not in place and suggests the market could “flood the zone” as these tools become more common. He argues that the risk is not limited to deepfakes; it includes “shallow fakes,” where content is close enough to be believable because it mixes true elements with lies.
For the broader ecosystem, Kreth says the industry needs traceability and observability—documented authentication and consent. He warns that any weak link in the chain of verification could undermine trust and enable misuse, especially when synthetic content scales faster than society can adapt.
Ladjevardian agrees that authenticity infrastructure matters, including automated takedowns for deepfakes and watermarks that help signal verification. But he says that even with those safeguards, the more serious challenge may be the sheer volume of synthetic characters and agents compared with the number of authentic human replicas. In his framing, the danger is less about any individual replica and more about a flood of synthetic agents that risks eroding what audiences and markets can reliably treat as real.
For businesses deciding whether to build or license digital replicas, the debate highlights a practical tension: scalable engagement versus durable trust. As avatars become tools for commerce, entertainment, and subscription services, companies will need to think beyond novelty features and address how legitimacy is established, how errors are handled, and how audience vulnerability is protected. In a media market where trust is often the main asset, the question is not only whether a digital clone can perform—but whether it can do so in a way that remains credible over time.