
Nvidia unveils H200, its newest high-end chip for training AI models

Nov 14, 2023 - cnbc.com
Nvidia has unveiled the H200, a new graphics processing unit (GPU) designed for training and deploying artificial intelligence models. The H200, an upgrade from the H100 chip used by OpenAI, includes 141GB of next-generation HBM3e memory, enabling the chip to perform "inference" nearly twice as fast as the H100. The new GPU, expected to ship in Q2 2024, will compete with AMD's MI300X GPU and will be compatible with the H100, allowing AI companies to adopt the new version without changing their server systems or software.

However, the H200 may not remain the fastest Nvidia AI chip for long, as the company plans to move from a two-year architecture cadence to a one-year release pattern due to high demand for its GPUs. Nvidia is set to announce and release its B100 chip, based on the forthcoming Blackwell architecture, in 2024.

Key takeaways:

  • Nvidia has unveiled the H200, a new graphics processing unit designed for training and deploying artificial intelligence models. The new GPU is an upgrade from the H100, which was used by OpenAI to train its advanced language model, GPT-4.
  • The H200 includes 141GB of next-generation HBM3e memory that will help the chip perform "inference," or using a large model after it's trained to generate text, images or predictions. It is expected to generate output nearly twice as fast as the H100.
  • The H200 is expected to ship in the second quarter of 2024 and will compete with AMD's MI300X GPU. It will be compatible with the H100, meaning that AI companies already training with the prior model won't need to change their server systems or software to use the new version.
  • Nvidia plans to move from a two-year architecture cadence to a one-year release pattern due to high demand for its GPUs. The company is expected to announce and release its B100 chip, based on the forthcoming Blackwell architecture, in 2024.