Jensen Huang: AI Has To Do '100 Times More' Computation Now Than When ChatGPT Was Released - Slashdot
Feb 27, 2025 - slashdot.org
In an interview with CNBC, Nvidia CEO Jensen Huang said that next-generation AI models require 100 times more computation than models did when ChatGPT was released, because of the reasoning processes they now perform. He pointed to DeepSeek's R1, OpenAI's GPT-4, and xAI's Grok 3 as examples of models that use these reasoning approaches. Huang praised DeepSeek for open-sourcing a world-class reasoning model, and he noted that China's share of Nvidia's revenue has fallen by about half as a result of export restrictions and competition from companies such as Huawei.
Huang also suggested that developers will find ways to work around export controls by adapting their software to whatever hardware is available, remarking that "software finds a way." He noted that Nvidia's GB200, which is sold in the U.S., can generate AI content 60 times faster than the versions of its chips that it sells in China under export restrictions.