OpenAI accidentally deleted potential evidence in NY Times copyright lawsuit (updated) | TechCrunch

Nov 24, 2024 - techcrunch.com
The New York Times and Daily News are suing OpenAI for allegedly using their content without permission to train its AI models. Lawyers for the publishers claim that OpenAI engineers accidentally deleted data relevant to the case; the data was stored on virtual machines that OpenAI had provided so the publishers could search for their copyrighted content. Most of the data was recovered, but the loss of folder structure and file names means it cannot be used to determine where the publishers' articles appear in OpenAI's training data. The publishers' lawyers argue that the incident shows OpenAI is best placed to search its own datasets for potentially infringing content.

In response, OpenAI's lawyers denied deleting any evidence and suggested the plaintiffs were responsible for a system misconfiguration that caused the technical issue. They also maintained that training models on publicly available data, including articles from the publishers, is fair use. OpenAI has signed licensing deals with several publishers but has not confirmed or denied using specific copyrighted works without permission.

Key takeaways:

  • The New York Times and Daily News are suing OpenAI for allegedly using their content to train its AI models without permission.
  • OpenAI engineers accidentally deleted data that could have been relevant to the case, making it difficult for the plaintiffs to determine where their copyrighted content was used in OpenAI's models.
  • OpenAI denies intentionally deleting any evidence and suggests that the plaintiffs are to blame for a system misconfiguration that led to the data loss.
  • OpenAI maintains that training models using publicly available data, including articles from The Times and Daily News, is fair use and doesn't require licensing or payment, even if it profits from those models.