AMD has already secured some major clients for its new chip. Meta and Microsoft, the two largest purchasers of Nvidia H100 GPUs in 2023, have committed to using the MI300X for various AI workloads. Microsoft will offer access to the chips through its Azure cloud service, and Oracle's cloud will also use them. OpenAI will support AMD GPUs in Triton, its GPU programming software.
Key takeaways:
- AMD, Meta, OpenAI, and Microsoft announced at an AMD investor event that they will use AMD's newest AI chip, the Instinct MI300X, as an alternative to Nvidia's expensive graphics processors.
- The Instinct MI300X is based on a new architecture and carries 192GB of HBM3, a high-performance memory type that transfers data faster and can hold larger AI models.
- AMD has improved its software suite, ROCm, to compete with Nvidia's industry-standard CUDA software, addressing a key shortcoming that has been one of the primary reasons AI developers prefer Nvidia.
- Meta and Microsoft, the two largest purchasers of Nvidia H100 GPUs in 2023, along with OpenAI and Oracle's cloud, have already signed up to use the new AMD chip.