Why Nvidia and AMD are roasting each other over AI performance claims

Dec 21, 2023 - theregister.com
The article discusses the ongoing dispute between Nvidia and AMD over the performance of their respective GPUs, specifically in the context of AI inferencing. Nvidia has criticized AMD's benchmarks for the newly launched MI300X GPUs, arguing that they fail to account for Nvidia's optimized software and the H100's support for FP8 data types. AMD has responded that Nvidia's figures are not a fair comparison, and has presented updated results that it says show the MI300X outperforming the H100 even when the H100 runs Nvidia's preferred software stack at FP8 precision.

The article also highlights the growing importance of software libraries and frameworks in boosting AI performance, with both companies attributing large gains to software optimizations. It notes, however, that hardware remains a critical factor: AMD's MI300X holds a significant advantage in memory capacity and bandwidth. The article concludes that both Nvidia and Intel are set to release new accelerators in the coming year, which will further intensify competition in the AI accelerator market.
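
As a rough illustration of why memory capacity and low-precision data types such as FP8 figure so heavily in these claims, the Python sketch below estimates the memory footprint of a large model's weights at FP16 versus FP8. The 70-billion-parameter model size and the per-parameter byte counts are illustrative assumptions, not figures from the article.

    # Back-of-envelope estimate of how much accelerator memory the weights of a
    # large language model occupy at different numeric precisions. Illustrative
    # only: real deployments also need memory for the KV cache and activations.

    BYTES_PER_PARAM = {
        "FP16": 2,  # 16-bit floating point
        "FP8": 1,   # 8-bit floating point, as supported by the H100 and MI300X
    }

    def weight_footprint_gb(num_params: float, precision: str) -> float:
        """Approximate memory needed for model weights, in gigabytes."""
        return num_params * BYTES_PER_PARAM[precision] / 1e9

    params = 70e9  # hypothetical 70-billion-parameter model
    for precision in ("FP16", "FP8"):
        print(f"{precision}: ~{weight_footprint_gb(params, precision):.0f} GB of weights")

    # Prints roughly:
    #   FP16: ~140 GB of weights  (more than the 80 GB of HBM on a single H100)
    #   FP8:  ~70 GB of weights   (fits on one H100, with headroom to spare on
    #                              the MI300X's larger memory)

Halving the per-parameter storage is one reason both vendors lean on lower-precision formats, and why the MI300X's larger, faster memory is pitched as an advantage for serving big models on fewer accelerators.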

Key takeaways:

  • Nvidia and AMD are in a dispute over the performance of their respective GPUs, with Nvidia criticizing AMD's benchmarks for not taking advantage of its optimized software or the H100's support for FP8 data types.
  • AMD responded that Nvidia's benchmarks aren't an apples-to-apples comparison, and claimed that the MI300X still comes out ahead even when the H100 runs Nvidia's preferred software stack at FP8 precision.
  • The dispute highlights the growing importance of software libraries and frameworks in boosting AI performance, with both companies claiming significant performance improvements through software optimizations.
  • AMD's MI300X has a significant memory advantage over Nvidia's H100, which is a crucial factor in AI inferencing workloads. However, Nvidia's upcoming H200 GPU will narrow this gap.