Creative Artists Agency (CAA), a leading entertainment and sports talent agency, aims to take a pioneering role in AI protection services for Hollywood celebrities.
With numerous stars having their digital likeness used without consent, CAA has developed a virtual media storage system for high-profile talent—actors, athletes, comedians, directors, musicians, and more—to store their digital assets, including names, images, digital scans, voice recordings, and other elements. This initiative is part of “theCAAvault,” the company’s studio where actors can record their body scans, faces, movements, and voices using scanning technology to create AI clones.
CAA has partnered with AI tech company Veritone to provide its digital asset management solution, as announced earlier this week.
The announcement comes at a time when AI deepfakes of celebrities are on the rise, often created without their consent. Tom Hanks, a renowned actor and client of CAA, was a victim of an AI scam seven months ago. He reported that a company used an AI-generated video of him to market a dental plan without his permission.
“In recent years, there has been significant misuse of our clients’ names, images, likenesses, and voices without proper consent, credit, or compensation. Clearly, the current legal framework is insufficient to protect them, leading to numerous ongoing lawsuits,” said Alexandra Shannon, CAA’s head of strategic development.
Creating digital clones requires extensive personal data, raising privacy concerns about the potential misuse or exposure of sensitive information. CAA clients can now securely store their AI digital doubles and other assets within a personalized hub in theCAAvault, accessible only to authorized users, allowing them to share and monetize their content as they see fit.
“This initiative allows us to set precedents for consent-based AI usage,” Shannon told Truth Voices. “We recognize that law will take time to adapt, so by enabling talent to create and own their digital likeness with [theCAAvault], there’s now a legitimate pathway for companies to collaborate with our clients. If a third party opts not to follow proper channels, legal cases can more easily demonstrate an infringement of rights, thereby protecting clients over time.”
Importantly, the vault ensures that actors and other talent receive fair compensation when their digital likenesses are used by companies.
“All these assets are owned by the individual client, allowing them to control access. Clients can decide the most appropriate business model for opportunities. This is an emerging space, and we believe these assets will grow in value over time. This shouldn’t be a cheaper way to work with somebody; we see [AI clones] as an enhancement rather than a cost-saving measure,” Shannon added.
CAA also represents major figures like Ariana Grande, Beyoncé, Reese Witherspoon, Steven Spielberg, and Zendaya, among others.
The use of AI cloning has generated considerable debate in Hollywood, with some fearing it could reduce job opportunities as studios might opt for digital clones over real actors. This issue was a notable topic during the 2023 SAG-AFTRA strikes, which concluded in November after an agreement with AMPTP (Alliance of Motion Picture and Television Producers) that emphasized the importance of human performers and included guidelines on the use of “digital replicas.”
Concerns also exist about the unauthorized use of AI clones of deceased celebrities, which can be troubling for their families. For example, Robin Williams’ daughter spoke out against an AI-generated voice recording of her father. However, some argue that ethical use of AI can nostalgically preserve and reintroduce iconic performances to future generations.
“AI clones are a powerful tool to ensure legacies endure for future generations. CAA maintains a consent-based approach for all AI applications and collaborates only with estates that hold permissions for these likeness assets. Artists decide who can own and use their likeness after their passing,” Shannon noted.
Shannon refrained from disclosing which CAA clients are currently utilizing the vault to store their AI clones but mentioned it remains a select few. CAA does charge a fee for clients to participate in the vault, although specific costs were not disclosed.
“Our ultimate goal is to make this available to all our clients and anyone within the industry. While it is not inexpensive, we anticipate costs will decrease over time,” she added.