The fight against unauthorized data scraping extends beyond artists to anyone who posts content online, and tools like Nightshade could help shift the balance of power back to content creators.
Key takeaways:
- A new tool called Nightshade, developed by a lab at the University of Chicago, subtly alters the pixels of an image so that machine-learning models misinterpret it, potentially disrupting the training of AI models that use scraped images (see the sketch after this list).
- Artists are increasingly protesting the tech sector's practice of scraping their work from the internet to train AI models, with some filing copyright lawsuits.
- Some companies, such as OpenAI and Stability AI, have offered to let artists opt out of training sets or have pledged to respect requests not to have their work scraped; however, there is currently no mechanism to enforce these promises.
- Artists and other content creators are calling on tech companies to replace opt-out mechanisms with asking for consent up front, and to start compensating creators for their contributions.
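To make the first takeaway concrete, here is a minimal, illustrative sketch of the general family of techniques the article describes: adding a perturbation to an image's pixels that is nearly invisible to a person but changes what a model "sees." This is not Nightshade's actual algorithm, which targets the training of text-to-image models and is not reproduced here; the sketch shows a single targeted gradient step against a generic image classifier, and all names (`model`, `target_label`, `epsilon`) are assumptions made for illustration.

```python
# Illustrative sketch only -- NOT Nightshade's actual method. It shows the
# general idea behind adversarial pixel perturbations: a change too small
# for a person to notice can still steer a model's interpretation.
import torch
import torch.nn.functional as F

def perturb_image(model: torch.nn.Module,
                  image: torch.Tensor,
                  target_label: torch.Tensor,
                  epsilon: float = 0.01) -> torch.Tensor:
    """Nudge `image` so a classifier leans toward the wrong `target_label`.

    One targeted, FGSM-style step: compute the gradient of the loss with
    respect to the pixels, then take a small step (of size epsilon) in the
    direction that makes the model favor the attacker-chosen label.
    """
    image = image.clone().detach().requires_grad_(True)
    logits = model(image)
    # Loss is measured against the *wrong*, attacker-chosen label.
    loss = F.cross_entropy(logits, target_label)
    loss.backward()
    # Step so as to decrease the loss toward the wrong label, then clamp
    # so the result is still a valid image in [0, 1].
    perturbed = image - epsilon * image.grad.sign()
    return perturbed.clamp(0.0, 1.0).detach()
```

To a human viewer the perturbed picture looks essentially unchanged, but the model's prediction shifts toward the attacker-chosen label; Nightshade applies a related idea at training time, so that models trained on poisoned images learn distorted associations.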