Nvidia is launching a new must-have AI chip — as customers still scramble for its last one

Nov 13, 2023 - theverge.com
Nvidia is launching a new top-tier chip for AI work, the HGX H200, which improves on the H100 with 1.4x more memory bandwidth and 1.8x more memory capacity. The H200 is the first GPU to use a faster memory spec called HBM3e, increasing the GPU’s memory bandwidth to 4.8 terabytes per second and its total memory capacity to 141GB. The first H200 chips will be released in the second quarter of 2024, and Nvidia is collaborating with global system manufacturers and cloud service providers to make them available.
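The quoted multipliers can be sanity-checked against the previous generation. A minimal sketch, assuming the H100 SXM's published baseline of 80 GB of HBM3 at 3.35 TB/s (the article does not state the H100 figures itself):

```python
# Sanity check on the H200 spec multipliers quoted in the article.
# Baseline H100 SXM figures are an assumption, not from the article.
h100_bandwidth_tbps = 3.35   # assumed H100 SXM memory bandwidth
h100_capacity_gb = 80        # assumed H100 SXM memory capacity

h200_bandwidth_tbps = 4.8    # stated in the article
h200_capacity_gb = 141       # stated in the article

bandwidth_gain = h200_bandwidth_tbps / h100_bandwidth_tbps
capacity_gain = h200_capacity_gb / h100_capacity_gb

print(f"bandwidth: {bandwidth_gain:.1f}x")  # prints "bandwidth: 1.4x"
print(f"capacity:  {capacity_gain:.1f}x")   # prints "capacity:  1.8x"
```

Under those assumed baseline figures, the ratios round to exactly the 1.4x and 1.8x improvements Nvidia cites.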

The H200 is designed to be compatible with systems that already support H100s, meaning cloud providers won't need to make any changes as they integrate H200s. The cloud divisions of Amazon, Google, Microsoft, and Oracle will be among the first to offer the new GPUs next year. However, the new chips are expected to be expensive, with the previous-generation H100s estimated to sell for between $25,000 and $40,000 each.

Key takeaways:

  • Nvidia is launching a new top-tier chip for AI work, the HGX H200, which upgrades the H100 with 1.4x more memory bandwidth and 1.8x more memory capacity.
  • The H200 is the first GPU to use a new, faster memory spec called HBM3e, increasing the GPU’s memory bandwidth to 4.8 terabytes per second and its total memory capacity to 141GB.
  • The H200 is built to be compatible with the same systems that support H100s, and cloud providers won't need to make any changes as they add H200s into their systems.
  • Nvidia plans to triple its production of the H100 in 2024, aiming to produce up to 2 million of them next year, up from around 500,000 in 2023.
