
DeepSeek fuels Nvidia H200 demand: What cloud firms are saying

Jan 31, 2025 - businessinsider.com
Cloud and inference providers are experiencing a surge in demand for Nvidia H200 chips due to the popularity of DeepSeek's AI models, particularly the R1 reasoning model. Despite an initial stock market sell-off for Nvidia, the demand for H200 chips has risen as DeepSeek's open-source models require powerful hardware for inference. Companies like Lambda and Valdi report increased pre-purchases and interest from startups and large enterprises alike, as these models offer significant performance and reasoning innovations. The H200 is currently the only widely available Nvidia chip capable of running DeepSeek's V3 model in its entirety on a single node, making it highly sought after.

DeepSeek's models, although trained on less powerful hardware, are compute-intensive at inference time and require substantial GPU capacity. They have fewer parameters than some competitors and are cheaper to run, attracting firms eager to leverage their capabilities. Demand for H200 chips is driven by the need for high-speed inference and the ability to run the full model efficiently. As DeepSeek's models become accessible through major cloud platforms, buyers are securing the best GPUs available, suggesting the increase in demand could outlast the initial hype.
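The single-node claim comes down to memory arithmetic. A rough sketch, using public spec figures not stated in the article (DeepSeek-V3's roughly 671B total parameters stored as FP8 weights, 141 GB of HBM3e per H200, 80 GB per H100, and a standard 8-GPU node), counting model weights only and ignoring KV-cache and activation headroom:

```python
# Back-of-the-envelope check of why V3 fits on one H200 node but not one H100 node.
# Assumptions (not from the article): ~671B parameters, FP8 = 1 byte per parameter.

PARAMS_B = 671          # DeepSeek-V3 total parameters, in billions
BYTES_PER_PARAM = 1     # FP8 weights -> ~671 GB for weights alone

def node_fits(gpu_mem_gb: float, gpus_per_node: int = 8) -> bool:
    """Return True if the model's weights fit in one node's combined GPU memory."""
    weights_gb = PARAMS_B * BYTES_PER_PARAM
    return gpus_per_node * gpu_mem_gb > weights_gb

print(node_fits(141))  # 8x H200 = 1128 GB -> True
print(node_fits(80))   # 8x H100 =  640 GB -> False
```

In practice real deployments need extra memory beyond the weights, so this is a lower bound, but it illustrates why the H200's larger HBM3e capacity is the deciding factor.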

Key takeaways:

  • Cloud and inference providers are experiencing increased demand for Nvidia H200 chips due to DeepSeek's AI models.
  • DeepSeek's open-source models require powerful hardware for inference, driving demand for Nvidia's H200 GPUs.
  • Despite Nvidia's stock decline, DeepSeek's model performance has spurred interest and pre-purchases of H200 capacity.
  • DeepSeek's models are efficient and cheaper to run, offering a competitive advantage in AI infrastructure.
