Starting Monday, creators will see a checklist in YouTube's Creator Studio asking them to indicate whether their videos contain realistic, AI-generated content. YouTube will then add a label to the video description noting that it contains altered or synthetic content. Creators won't be required to disclose synthetic content that is clearly unrealistic or where AI was used only for productivity purposes. Consistently failing to disclose synthetic content that should be labeled may result in penalties such as content removal or suspension from YouTube's Partner Program.
Key takeaways:
- YouTube is introducing a new policy requiring creators to disclose when their videos contain AI-generated or manipulated content that appears realistic.
- Creators will see a new checklist when uploading videos, asking them to identify if their content includes synthetic elements that could mislead viewers.
- YouTube will label videos containing AI-generated content, with more prominent labels for videos on sensitive topics like politics. Content created with YouTube's own AI tools will also be clearly labeled.
- Creators who consistently fail to disclose synthetic content may face penalties, including content removal or suspension from YouTube's Partner Program.