YouTube and CAA join hands to combat deepfake videos

YouTube's latest tool will empower celebrities to combat unauthorised uses of their likenesses
An undated image of several YouTube icons. — Freepik

In response to the alarming rise in deepfake videos featuring celebrities, YouTube announced on Tuesday a collaboration with the entertainment and sports agency Creative Artists Agency (CAA) to protect public figures from the exploitation of their identities through deepfakes.

The platform has announced an advanced tool for the early detection of artificial intelligence (AI)-generated deepfake content, empowering celebrities to combat unauthorised uses of their likenesses.

YouTube revealed its plan to develop cutting-edge technology to assist creators in maintaining control over their likenesses, including the ability to detect and block videos that imitate facial features, voices, and other personal attributes.

According to YouTube, “several of the world's most influential figures” will receive early access to this advanced technology. While the platform has not revealed any names, it confirmed that these celebrities will include “award-winning actors” and athletes from leagues such as the NBA and NFL.

The tool will enable celebrities to request the removal of unauthorised content. YouTube is gearing up to launch the initiative in 2025, focusing initially on celebrities before expanding it to a broader audience on the video-streaming platform, including influencers and other prominent creators.

Through this initiative, YouTube seeks to address rising concerns surrounding impersonation and the misuse of AI-powered technology.

“CAA's clients' direct experience with digital replicas in the evolving landscape of AI will be critical in shaping a tool that responsibly empowers and protects creators and the broader YouTube community,” the company said.