
Generative AI models can plagiarize images with simple phrase prompts

Jan 07, 2024 - businessinsider.com
AI researchers Gary Marcus and Reid Southen have discovered that generative AI models can produce near replicas of trademarked characters from simple text prompts. The researchers tested two visual AI models, Midjourney and Dall-E 3, and found that both could reproduce almost exact images from movies and video games, even with indirect prompts. This raises concerns about the capacity of generative AI models for plagiarism, especially as the relationship between inputs and outputs in these models is not clear to end users.

The researchers argue that the burden of preventing copyright infringement currently falls on artists or image owners, which is problematic. They suggest that AI models should remove copyrighted works from their training data, filter out problematic queries, or list the sources used to generate images. They also recommend that AI models should only use properly licensed training data until a better solution for reporting the origin of images and filtering out copyright violations is found.

Key takeaways:

  • AI researchers have found that generative AI models can produce near replicas of trademarked characters with simple text prompts, potentially leading to copyright infringement.
  • Two visual AI models, Midjourney and Dall-E 3, were tested and found capable of reproducing almost exact images from movies and video games even with brief and indirect prompts.
  • The study raises concerns about generative AI models' capacity for plagiarism, as the relationship between the inputs and outputs isn't entirely clear to end users, making it hard to predict when a model will generate a plagiaristic response.
  • The authors suggest that AI models could remove copyrighted works from their training data, filter out problematic queries, or list the sources used to generate images to prevent copyright infringement.
