YouTube officially launches likeness-detection technology
- By Kumail Shah -
- Oct 22, 2025

YouTube announced on Tuesday that its likeness-detection technology has officially been made available to eligible creators in the YouTube Partner Program, following a pilot phase. This technology enables creators to request the removal of AI-generated content that features their likeness.
This rollout is the first of its kind. A YouTube spokesperson told TechCrunch that eligible creators received emails about this update this morning.
YouTube's detection technology identifies and manages AI-generated content that features a creator's likeness, such as their face or voice.
The likeness-detection tool is designed to protect creators from having their likeness misused, whether to endorse products and services they never agreed to promote or to spread misinformation. There have been numerous cases of AI likeness misuse in recent years, such as the company Elecrow using an AI clone of YouTuber Jeff Geerling's voice to advertise its products.
On the YouTube Insider channel, the company gave instructions on how creators can use the technology. To start the onboarding process, creators must navigate to the “Likeness” tab and agree to data processing. They then need to scan a QR code with their smartphone, which will take them to a web page for identity verification. This verification requires a photo ID and a short selfie video.
Once YouTube grants access to the tool, creators can review all detected videos and submit removal requests under the company's privacy guidelines; they can also file a copyright request. A video archiving option is available as well.
Creators can opt out of the technology at any time, and YouTube will stop scanning for videos within 24 hours of their doing so.
The likeness-detection technology has been in a pilot phase since early this year. YouTube first announced a partnership with Creative Artists Agency (CAA) last year to help celebrities, athletes, and creators identify content on the platform that uses their AI-generated likeness.
In April, YouTube voiced its support for the NO FAKES Act, legislation designed to tackle AI-generated replicas that mimic a person's image or voice to deceive viewers or create harmful content.