
Meta debuts a tool for watermarking AI-generated videos | TechCrunch

Dec 13, 2024 - techcrunch.com
The proliferation of generative AI has led to a significant increase in deepfake content online, with a reported 4x rise from 2023 to 2024, according to ID verification platform Sumsub. Deepfakes now account for 7% of all fraud, involving impersonations, account takeovers, and social engineering. In response, Meta has introduced Meta Video Seal, a tool designed to apply imperceptible watermarks to AI-generated videos. This open-source tool aims to enhance the detection of AI-generated content and protect originality, joining Meta's other watermarking tools like Watermark Anything and Audio Seal.

Meta Video Seal is designed to be robust against common video edits and compression, and it offers a hidden-message feature for tracing a video's origins. Watermarking technologies already exist from companies like DeepMind and Microsoft, but Meta claims its tool addresses their shortcomings, such as limited resilience to compression and poor scalability. Adoption challenges remain, however, as industry players may prefer proprietary solutions. To encourage uptake, Meta is launching a public leaderboard, Meta Omni Seal Bench, to compare watermarking methods, and organizing a workshop at the ICLR conference. Meta aims to foster collaboration with researchers and developers to advance watermarking technology.
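To make the idea of an "imperceptible watermark with a hidden message" concrete, here is a deliberately simplified toy sketch in the least-significant-bit style. This is not Meta Video Seal's actual algorithm: Video Seal is built to survive edits and compression, which naive LSB embedding does not; the sketch only illustrates how a bit string can be hidden in pixel values without visibly changing them.

```python
# Toy illustration of hidden-message watermarking (LSB style).
# NOT Meta Video Seal's method -- plain LSB embedding is fragile and would
# not survive the compression and edits that Video Seal is designed to resist.

def embed_message(pixels, bits):
    """Hide a bit string in the least significant bit of each pixel value."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        # Overwriting the LSB changes each pixel value by at most 1,
        # which is imperceptible to a viewer.
        out[i] = (out[i] & ~1) | bit
    return out

def extract_message(pixels, n_bits):
    """Recover the hidden bits by reading each pixel's least significant bit."""
    return [p & 1 for p in pixels[:n_bits]]

frame = [200, 201, 198, 205, 210, 199, 202, 204]  # one row of 8-bit pixels
message = [1, 0, 1, 1, 0, 1, 0, 0]                # hidden provenance bits
watermarked = embed_message(frame, message)
recovered = extract_message(watermarked, len(message))
```

A production scheme like Video Seal spreads the message redundantly across many frames and frequencies so it can still be decoded after cropping, re-encoding, or filtering; the round trip above only works on unmodified pixels.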

Key takeaways:

  • Deepfakes have significantly increased, accounting for 7% of all fraud in 2024, according to Sumsub.
  • Meta has released an open-source tool called Meta Video Seal to apply imperceptible watermarks to AI-generated videos.
  • Video Seal is designed to be robust against common video edits and compression, but it faces challenges in adoption due to existing proprietary solutions.
  • Meta is launching a public leaderboard and organizing a workshop to encourage the adoption and development of watermarking technologies.
