Nvidia CEO says AI models like DeepSeek's R1 need 100 times more compute

Feb 27, 2025 - businessinsider.com
Nvidia CEO Jensen Huang emphasized during an earnings call that reasoning models demand far more computation than traditional models, roughly 100 times more. Despite Nvidia surpassing revenue expectations, investor response was lukewarm, partly due to concerns about competition from Chinese firm DeepSeek's efficient open-source models. Huang dismissed fears that DeepSeek's models would reduce demand for Nvidia's computing power, noting that reasoning models like DeepSeek's R1 are resource-intensive and widely adopted by AI developers. He added that the majority of Nvidia's demand now comes from inference, the computation performed when a trained model generates responses.

Competition in the inference computing market is intensifying: startups such as Tenstorrent and Etched have received significant funding, and cloud providers are developing custom AI chips. Analysts suggest that Nvidia's dominance in AI compute, particularly in inference, may be challenged as these competitors grow. While Nvidia still holds a substantial market share, some analysts warn it could fall to around 50% as the landscape evolves. Despite these challenges, Nvidia's latest chip generation, Blackwell, is designed to strengthen its position in inference computing.

Key takeaways:

  • Nvidia CEO Jensen Huang stated that reasoning models require 100 times more computing resources than traditional models, with most demand coming from inference.
  • Despite Nvidia's strong revenue performance, investor response was lukewarm due to concerns about competition and the impact of DeepSeek's efficient open-source models.
  • Inference computing is becoming increasingly important; Nvidia's new Blackwell chip generation is built to enhance this capability, but competition in the area is rising.
  • Analysts suggest that Nvidia's market share in inference could decrease as new competitors and custom AI chips from cloud companies emerge.
