Despite the shift toward AI inference, Huang stated that Nvidia remains the largest inference platform globally, and that its scale and reliability give it a significant advantage over startups. He also noted that while most of Nvidia's current computing workloads involve pre-training AI models, he expects more AI inference to occur as more people run AI models. However, some AI executives have warned that pre-training methods are beginning to show diminishing returns.
Key takeaways:
- Nvidia reported over $19 billion in net income last quarter, but investors are concerned about its ability to maintain rapid growth.
- CEO Jensen Huang believes that "test-time scaling," a method that improves AI models by giving them more computing power at inference time, could play a larger role in Nvidia's business moving forward.
- While Nvidia's chips are currently the gold standard for training AI models, the increased emphasis on AI inference could lead to more competition from startups like Groq and Cerebras.
- Huang reassured investors that Nvidia is well positioned for the shift toward more AI inference, calling Nvidia the largest inference platform in the world today.