Critics of Nightshade have labeled it a "virus" and questioned its legality, but project leader Ben Zhao asserts it is perfectly legal and aims to force tech companies to pay for licensed work. The team behind Nightshade is also responsible for Glaze, a tool that distorts how AI models perceive artistic style. Artists are advised to apply both tools before sharing their work online, as part of a wider effort to protect artists' rights in the digital age.
Key takeaways:
- The University of Chicago has developed a project called Nightshade that helps artists protect their work from being used to train AI models without consent by 'poisoning' image data, making it useless or disruptive to AI model training.
- Nightshade subtly alters the pixels in an image so that AI models interpret it as something entirely different from what a human viewer sees, which can corrupt the model's understanding of the image.
- Artists are encouraged to use both Glaze and Nightshade before sharing their work online to protect their work from mimicry and unauthorized training.
- The ultimate goal of Glaze and Nightshade is to impose an 'incremental price' on each piece of data scraped without permission, until training models on unlicensed data is no longer tenable, forcing companies to license uncorrupted images to train their models.
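The pixel-perturbation idea described above can be sketched in miniature. The snippet below is a toy illustration only, not Nightshade's actual algorithm: real tools compute the perturbation by optimizing against a target model, whereas this sketch simply adds bounded random noise, showing how a change capped at a few intensity levels per pixel stays nearly invisible to a human viewer. The function name `perturb_image` and the `epsilon` bound are assumptions made for illustration.

```python
import numpy as np

def perturb_image(image: np.ndarray, epsilon: float = 4.0, seed: int = 0) -> np.ndarray:
    """Apply a small, bounded pixel perturbation to an 8-bit image array.

    Toy stand-in for adversarial perturbation: each pixel shifts by at
    most `epsilon` intensity levels (out of 255), so the change is nearly
    imperceptible to humans. Nightshade itself optimizes the perturbation
    against a model; here we just use bounded random noise.
    """
    rng = np.random.default_rng(seed)
    # Per-channel noise bounded in [-epsilon, +epsilon].
    delta = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Keep pixel values inside the valid 0-255 range.
    return np.clip(image.astype(np.float64) + delta, 0, 255).astype(np.uint8)

# Example: a 64x64 RGB image of uniform mid-gray pixels.
img = np.full((64, 64, 3), 128, dtype=np.uint8)
poisoned = perturb_image(img)
# The per-pixel difference never exceeds the epsilon bound.
max_diff = int(np.abs(poisoned.astype(int) - img.astype(int)).max())
```

The key point the sketch captures is the asymmetry of perception: a perturbation this small leaves the image visually unchanged for people, while a model trained on many such samples can be steered toward a wrong interpretation.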