Nvidia is positioning itself as a platform provider rather than just a chip maker, with the aim of becoming the go-to supplier for AI companies. The new software and chips serve this strategy: the software lets developers run their models on any Nvidia GPU, while the chips supply the power needed to train and deploy large AI models.
Key takeaways:
- Nvidia has announced a new generation of artificial intelligence chips and software, including the Blackwell graphics processors and the GB200 chip, which will ship later this year.
- The company also introduced revenue-generating software called NIM (Nvidia Inference Microservices), designed to make it easier to deploy AI models and run programs on any of Nvidia's GPUs.
- Nvidia's Blackwell-based processors, such as the GB200, offer a significant performance upgrade for AI companies: 20 petaflops of AI performance, compared with 4 petaflops for the previous-generation H100 chip.
- Amazon, Google, Microsoft, and Oracle will sell access to the GB200 through cloud services, and Amazon Web Services will build a server cluster with 20,000 GB200 chips.