
Artists Can Fight Back Against AI by Killing Art Generators From the Inside

Oct 23, 2023 - gizmodo.com
Researchers led by Ben Zhao, a professor of computer science at the University of Chicago, have developed a tool called "Nightshade" that subtly corrupts images at the pixel level to disrupt AI training. The tool is designed to protect artists' work from being used by tech companies to train AI without their permission. When enough of these distorted images are used in AI training, the AI model starts to malfunction, interpreting prompts incorrectly and producing distorted styles.
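
The article does not publish Nightshade's actual optimization, which works against a text-to-image model's feature space. As a rough illustration of the general idea only, the sketch below nudges an artwork toward a hypothetical "decoy" image while clamping every pixel change to a small budget, so the edit stays hard for a human to notice. The file names, the `eps` budget, and the decoy-blending approach are all illustrative assumptions, not the researchers' method.

```python
# Illustrative sketch only: this is NOT Nightshade's algorithm. It shows
# the general idea of an imperceptible, bounded pixel-level perturbation
# by shifting an image toward a decoy within a per-pixel budget.
import numpy as np
from PIL import Image

def poison_image(src_path: str, decoy_path: str, out_path: str, eps: int = 8) -> None:
    """Shift `src` toward `decoy` by at most `eps` intensity levels per channel."""
    src = np.asarray(Image.open(src_path).convert("RGB"), dtype=np.int16)
    decoy_img = Image.open(decoy_path).convert("RGB")
    # Match the decoy's size to the source (PIL resize takes (width, height)).
    decoy = np.asarray(decoy_img.resize((src.shape[1], src.shape[0])), dtype=np.int16)
    # Bounded perturbation: clamp each per-pixel change to [-eps, +eps]
    # so the modified image stays visually close to the original.
    delta = np.clip(decoy - src, -eps, eps)
    poisoned = np.clip(src + delta, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(out_path)

if __name__ == "__main__":
    # Hypothetical file names for illustration.
    poison_image("artwork.png", "decoy.png", "artwork_poisoned.png")
```

In a real poisoning attack the perturbation is optimized against the target model's feature extractor rather than blended in pixel space; the bounded clamp above is only what keeps the change visually subtle.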

The team plans to integrate Nightshade into Glaze, another tool they developed that creates a "style cloak" to mask artists' images from AI. Nightshade will also be released as open source. Other efforts to protect artists' work include watermark IDs that identify AI-generated images and legal action against companies that use copyrighted work for AI training. However, these measures do not prevent the initial data scraping used to build AI training sets.

Key takeaways:

  • A group of researchers led by Ben Zhao, a professor of computer science at the University of Chicago, has developed a tool called "Nightshade" that poisons image-trained AI models by subtly manipulating images at the pixel level.
  • When the manipulated images are used to train an AI model, they can cause the model to misinterpret prompts, effectively breaking it down.
  • Zhao's team also developed Glaze, a tool that creates a "style cloak" to mask artists' images and mislead AI art generators. Nightshade is set to be integrated into Glaze and also released as open source.
  • While these tools can't change existing models, they can disrupt companies that actively train AI on artists' work, forcing them to manually find and remove poisoned images or retrain the model from scratch.