
Nightshade, the tool that ‘poisons’ data, gives artists a fighting chance against AI | TechCrunch

Jan 26, 2024 - techcrunch.com
Researchers at the University of Chicago have developed a project called Nightshade that "poisons" image data to protect artists' work from being used to train AI models without consent. The tool subtly alters an image's pixels so that AI models interpret it as something entirely different from what a human viewer sees; if enough poisoned images end up in a training set, the model starts generating unrelated output. The tool gives artists a way to fight back against companies that use their work without permission or compensation.
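As a rough illustration of the mechanism described above (not Nightshade's actual algorithm, whose details are in the team's research), here is a toy PyTorch sketch of gradient-based pixel perturbation: it nudges an image, within a small per-pixel budget, so that a model's feature extractor "sees" it as a different target concept. Every name here (poison_image, feature_extractor, epsilon) is a hypothetical assumption for the sketch.

```python
# Conceptual sketch only -- NOT Nightshade's actual method or code.
import torch

def poison_image(image, target_image, feature_extractor,
                 epsilon=8 / 255, steps=100, lr=0.01):
    """Perturb `image` so its features drift toward `target_image`'s,
    while keeping each pixel within +/- epsilon of the original
    (the small budget is what keeps the change hard to see)."""
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        target_features = feature_extractor(target_image)

    for _ in range(steps):
        optimizer.zero_grad()
        poisoned = (image + delta).clamp(0, 1)
        # Pull the poisoned image's features toward the target concept.
        loss = torch.nn.functional.mse_loss(
            feature_extractor(poisoned), target_features)
        loss.backward()
        optimizer.step()
        # Enforce the perceptibility budget on the perturbation.
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)

    return (image + delta).detach().clamp(0, 1)
```

A model trained on many images perturbed this way would, in principle, start associating the original subject with the target concept, which is the corruption effect the article describes.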

Critics have labeled Nightshade a "virus" and questioned its legality, but project leader Ben Zhao asserts it is perfectly legal and aims to push tech companies into paying for licensed work. The team behind Nightshade also built Glaze, a tool that distorts how AI models perceive artistic style so they cannot mimic it. Artists are advised to apply both tools before sharing their work online, as part of a wider effort to protect artists' rights in the digital age.

Key takeaways:

  • The University of Chicago has developed a project called Nightshade that helps artists protect their work from being used to train AI models without consent by 'poisoning' image data, making it useless or disruptive to AI model training.
  • Nightshade subtly changes an image's pixels to trick AI models into interpreting it as something completely different from what a human viewer sees, corrupting the model's understanding of the image.
  • Artists are encouraged to use both Glaze and Nightshade before sharing their work online to protect their work from mimicry and unauthorized training.
  • The ultimate goal of Glaze and Nightshade is to attach an 'incremental price' to each piece of data scraped without permission, until training models on unlicensed data is no longer tenable, forcing companies to license uncorrupted images instead.