This new data poisoning tool lets artists fight back against generative AI

Oct 23, 2023 - technologyreview.com
A new tool called Nightshade lets artists add invisible changes to the pixels of their artwork that disrupt AI models if the art is scraped into a training set without permission. It is intended to deter AI companies that train their models on artists' work without consent. The perturbations Nightshade introduces can cause models to misinterpret images, potentially rendering some of their outputs useless. The tool is being integrated into Glaze, a related tool that lets artists mask their personal style to prevent it from being scraped by AI companies.

Nightshade exploits a security vulnerability in generative AI models: because they are trained on large amounts of data scraped from the internet, poisoned images can slip into their training sets and cause them to malfunction. The poisoned data is hard to remove, since each corrupted sample must be found and deleted individually. The tool has been tested against several AI models and successfully degraded their outputs. There are concerns that the data poisoning technique could itself be used maliciously, but the researchers believe the tool could push AI companies to respect artists' rights.
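As a rough intuition for what an "invisible change" means at the pixel level, here is a minimal Python sketch, assuming Pillow and NumPy are available. It is not Nightshade's actual algorithm: a real poisoning attack optimizes the perturbation so the image's learned features drift toward a different concept, whereas this sketch uses simple bounded random noise, and the function name, epsilon bound, and file names are hypothetical.

import numpy as np
from PIL import Image

def add_imperceptible_perturbation(path_in, path_out, epsilon=4):
    # Toy illustration, NOT Nightshade's method: shift each RGB channel
    # by at most +/- epsilon. The change is too small for a viewer to
    # notice, but it alters the exact pixel values a scraper ingests.
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Hypothetical file names, for illustration only.
add_imperceptible_perturbation("artwork.png", "artwork_shaded.png")

A real attack would replace the random noise with an optimized perturbation, for example one chosen so the image's features resemble a different concept to the model while the pixel change stays bounded and imperceptible.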

Key takeaways:

  • A new tool called Nightshade allows artists to add invisible changes to their artwork that can disrupt AI models if the art is used without permission. This is intended to deter AI companies from using artists' work without consent.
  • Nightshade exploits a security vulnerability in generative AI models, causing them to malfunction when they incorporate the altered images into their training data. This can result in bizarre outputs, like dogs appearing as cats or cars as cows.
  • The team behind Nightshade also developed Glaze, a tool that allows artists to mask their personal style to prevent it from being scraped by AI companies. The team plans to integrate Nightshade into Glaze and make it open source.
  • While there are concerns that the data poisoning technique could be used maliciously, an attacker would need thousands of poisoned samples to meaningfully damage a large model. Researchers are nevertheless calling for work on defenses against such attacks.