
Kin.art launches free tool to prevent GenAI models from training on artwork | TechCrunch

Jan 23, 2024 - techcrunch.com
The article discusses the issue of AI models being trained on artists' work without their knowledge or permission. To combat this, tools are being developed that let artists modify their artwork so it can't be used to train AI models. One such tool is Nightshade, which alters the pixels of an image to trick models into misinterpreting it. Another tool, Kin.art, co-developed by Flor Ronsmans De Vry, uses image segmentation and tag randomization to interfere with the model training process.

Kin.art's tool is free but requires artists to upload their artwork to Kin.art’s portfolio platform. While this could lead artists towards Kin.art’s fee-based art commission services, Ronsmans De Vry insists the effort is largely philanthropic. He plans to offer the tool to third parties in the future to help protect their data from unlicensed use, especially for platforms that need to provide public-facing services and can't gate their data.
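The article doesn't detail Kin.art's implementation, but the two ideas it names can be illustrated conceptually. The sketch below is a hypothetical toy version: tag randomization swaps an image's descriptive tags for unrelated decoys so scraped image-caption pairs mislabel the artwork, and segmentation splits the pixel grid into tiles and shuffles them so the spatial structure a training pipeline would learn from is broken. All names and the list-of-lists image representation are assumptions for illustration, not Kin.art's actual code.

```python
import random

def randomize_tags(tags, decoy_vocab, rng):
    # Replace each real tag with a random decoy, so a scraper
    # pairing this image with its metadata gets misleading labels.
    return [rng.choice(decoy_vocab) for _ in tags]

def shuffle_segments(image, tile_h, tile_w, rng):
    # Split a 2D pixel grid into tiles (assumes dimensions divide
    # evenly), shuffle the tile order, and reassemble the grid.
    h, w = len(image), len(image[0])
    tiles = []
    for r in range(0, h, tile_h):
        for c in range(0, w, tile_w):
            tiles.append([row[c:c + tile_w] for row in image[r:r + tile_h]])
    rng.shuffle(tiles)
    out = [[0] * w for _ in range(h)]
    cols = w // tile_w
    for i, tile in enumerate(tiles):
        r0, c0 = (i // cols) * tile_h, (i % cols) * tile_w
        for dr, row in enumerate(tile):
            for dc, px in enumerate(row):
                out[r0 + dr][c0 + dc] = px
    return out

rng = random.Random(0)
decoy_tags = randomize_tags(["watercolor", "portrait", "sunset"],
                            ["noise", "glitch", "static"], rng)
image = [[r * 4 + c for c in range(4)] for r in range(4)]
scrambled = shuffle_segments(image, 2, 2, rng)
```

Note the contrast with Nightshade's approach: nothing here cryptographically or adversarially perturbs pixel values; the disruption comes from breaking the image-to-label correspondence and the spatial layout, which is consistent with the article's claim that the method avoids expensive per-image modifications.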

Key takeaways:

  • Artists and activists are developing tools to prevent their artwork from being used without permission in the training of Generative AI models. Tools like Nightshade and Kin.art modify the artwork or its metadata to disrupt the model training process.
  • Kin.art's tool, co-developed by Flor Ronsmans De Vry, uses image segmentation and tag randomization to interfere with the model training process. It was launched to protect artists' rights and to promote an ethical approach to AI training.
  • Ronsmans De Vry asserts that Kin.art's tool is superior to existing solutions as it doesn't require expensive cryptographic modifications to images. It can also be combined with other methods for additional protection.
  • While the tool is free, artists have to upload their artwork to Kin.art's platform to use it. The company plans to offer the tool as a service to other platforms in the future, to help them protect their data from unlicensed use.
