Tool preventing AI mimicry cracked; artists wonder what’s next

Jul 04, 2024 - arstechnica.com
The article discusses growing concern among artists that AI image generators can replicate their unique styles, dilute their brands, and potentially replace them in the market. This concern has driven demand for tools like Glaze, developed by University of Chicago professor Ben Zhao, which adds imperceptible noise to images to prevent AI models from copying artists' styles. However, security researchers have criticized the tool, claiming it can be easily bypassed, and a growing backlog of access requests has formed.

The Glaze Project has seen a surge in requests for its free tools, which are designed to prevent style mimicry and discourage data scraping without an artist's consent. Demand is so high that the team, made up mostly of volunteers, is struggling to keep up with the approval process. The article also cites the case of Adobe selling AI-generated images that copied the style of famed photographer Ansel Adams, which his estate condemned. It concludes that artists will likely have to wait for effective protections against AI threats.

Key takeaways:

  • Artists are facing a precarious time as AI image generators are getting better at replicating unique styles, potentially diluting artists' brands and replacing them in the market.
  • Glaze, a tool that adds a small amount of imperceptible-to-humans noise to images to stop image generators from copying artists' styles, is in high demand, and its team is struggling to keep up with the surge in requests.
  • Security researchers have claimed that it is easy to bypass Glaze's protections, sparking a debate about its effectiveness in protecting artists' works.
  • Despite the concerns, many artists are still waiting for access to Glaze, indicating a desperate need for protections against AI mimicry in the art world.