
More ethical AI? Fairly Trained launches to certify gen AI tools trained on licensed data

Jan 18, 2024 - venturebeat.com
A new non-profit organization, Fairly Trained, has been launched to support creators who believe their work should not be used to train AI without their prior consent. Led by former Stability AI employee Ed Newton-Rex, the organization certifies generative AI companies that train on data provided with its creators' consent. Newton-Rex argues that AI training is a different proposition from human learning and should be treated as such, and urges companies to change their approach and move to a licensing model.

Fairly Trained offers a "Licensed Model (L) certification for AI providers" to help consumers make informed decisions about the AI tools they use. Certification involves an online form and a written submission, with fees charged on a sliding scale based on a company's annual revenue. Several companies have already sought and received the L certification, including Beatoven.AI, Boomy, BRIA AI, Endel, LifeScore, Rightsify, Somms.ai, Soundful, and Tuney.

Key takeaways:

  • A new non-profit organization called "Fairly Trained" has been launched to support the idea that data creators and posters should be asked for consent before their work is used in AI training.
  • Ed Newton-Rex, the CEO of Fairly Trained, argues that AI training is a different proposition from human learning and should be treated as such, and advocates a licensing model for AI training data.
  • Fairly Trained offers a "Licensed Model (L) certification for AI providers" to help consumers choose tools trained on data licensed expressly to AI companies, with fees charged on a sliding scale based on a company's annual revenue.
  • Several companies have already sought and received the L certification from Fairly Trained, including Beatoven.AI, Boomy, BRIA AI, Endel, LifeScore, Rightsify, Somms.ai, Soundful, and Tuney.