The platform is also using AI to detect content that violates its rules. Updates to YouTube's privacy complaint process will allow the removal of AI-generated videos that simulate an identifiable person, and music partners will be able to request the takedown of AI-generated music that imitates an artist's unique voice.
Key takeaways:
- YouTube is implementing new rules for AI content, requiring creators to disclose when they have used generative artificial intelligence to create realistic videos. Non-disclosure may lead to penalties, including content removal or suspension from the platform's revenue-sharing program.
- The new rules are an expansion of Google's policy from September, which required political ads using AI on YouTube and other Google platforms to have a prominent warning label.
- YouTube is also using AI to detect content that violates its rules, and is updating its privacy complaint process to allow for the removal of AI-generated videos that simulate an identifiable person.
- Music partners, such as record labels or distributors, will be able to request the removal of AI-generated music content that mimics an artist's unique voice.