OpenAI Reportedly Hitting Law of Diminishing Returns as It Pours Computing Resources Into AI

Nov 12, 2024 - futurism.com
OpenAI, known for the large language models (LLMs) behind ChatGPT, is reportedly hitting a wall in its efforts to scale up its models, according to recently departed cofounder Ilya Sutskever. Despite pouring more computing power into these models, Sutskever suggests that the firm's recent tests indicate a plateau in progress. This aligns with recent claims that AI companies, particularly OpenAI, are running into the law of diminishing returns.

Over the weekend, it was reported that OpenAI is seeing slower gains from each new flagship model. This challenges the core belief that AI models will keep improving at a consistent rate as long as they are fed more data and computing power. Data scientist Yam Peleg suggests that the focus is now shifting to data quality rather than quantity, indicating that the major players have hit the limits of training for longer and collecting more data.
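For readers unfamiliar with the "scaling" at issue: AI labs have leaned on empirical scaling laws, which model a network's loss as a smooth power law in model size and training data. One widely cited form (the Chinchilla formula from Hoffmann et al., 2022, offered here as illustrative background rather than anything claimed in the article) is

L(N, D) = E + A·N^(−α) + B·D^(−β)

where N is the parameter count, D the number of training tokens, E an irreducible loss floor, and A, B, α, β fitted constants. Because the exponents α and β are well below 1, each additional doubling of compute or data buys a smaller drop in loss, which is the mathematical shape of the diminishing returns described above.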

Key takeaways:

  • OpenAI is reportedly facing challenges as it attempts to scale up the large language models (LLMs) behind ChatGPT, with efforts seemingly hitting a plateau, according to cofounder Ilya Sutskever.
  • Sutskever's comments suggest that AI companies, including OpenAI, may be encountering the law of diminishing returns as they continue to pour resources into AI development.
  • Reports suggest that with each new flagship model, OpenAI is seeing a slowdown in the sort of "leaps" users have come to expect since the release of ChatGPT in November 2022.
  • Data scientist Yam Peleg suggests that the focus in AI development is now shifting towards data quality, as companies have likely reached the limits of training longer and collecting more data.