Nvidia Conquers Latest AI Tests

Jun 13, 2024 - spectrum.ieee.org
Nvidia's Hopper architecture has once again dominated the latest round of training tests released by MLPerf, the AI benchmarking suite. This round added tests for fine-tuning of large language models and for graph neural networks. A system built from 11,616 Nvidia H100 GPUs topped each of the nine benchmarks, setting records in five of them. Computers using Google's and Intel's AI accelerators also competed, but Nvidia's largest entry delivered a 3.2-fold speedup on GPT-3 training over last year's largest entrant.

Nvidia attributes much of its success to software improvements, logging a 27 percent gain on GPT-3 training relative to the June 2023 MLPerf benchmarks. The company also implemented flash attention, an algorithm that speeds up transformer networks by minimizing writes to memory, which shaved as much as 10 percent off training times. Upcoming training rounds in 2025 may see head-to-head contests comparing new accelerators from AMD, Intel, and Nvidia.
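To make the memory-traffic point concrete, below is a minimal sketch of tiled attention with an online softmax, the core idea behind flash attention: key/value blocks are processed one tile at a time, so the full N x N score matrix is never written out to memory. This is an illustrative NumPy reconstruction, not Nvidia's fused GPU kernel; the function name tiled_attention, the block_size parameter, and the toy shapes are assumptions made for the example.

import numpy as np

def tiled_attention(Q, K, V, block_size=64):
    # Sketch of the tiled-attention idea (assumed helper, not a real library call):
    # process key/value blocks with an online softmax so the full N x N score
    # matrix is never materialized. Real FlashAttention fuses this loop into a
    # single GPU kernel that keeps each tile in on-chip SRAM.
    n, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(Q)
    row_max = np.full(n, -np.inf)   # running per-row maximum of the scores
    row_sum = np.zeros(n)           # running per-row softmax denominator
    for start in range(0, n, block_size):
        Kb = K[start:start + block_size]
        Vb = V[start:start + block_size]
        scores = (Q @ Kb.T) * scale               # scores for this tile only
        new_max = np.maximum(row_max, scores.max(axis=1))
        correction = np.exp(row_max - new_max)    # rescale earlier partial results
        p = np.exp(scores - new_max[:, None])
        out = out * correction[:, None] + p @ Vb
        row_sum = row_sum * correction + p.sum(axis=1)
        row_max = new_max
    return out / row_sum[:, None]

# Sanity check against naive attention that materializes the full score matrix.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((256, 64)) for _ in range(3))
scores = (Q @ K.T) / np.sqrt(64)
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
naive = (weights / weights.sum(axis=1, keepdims=True)) @ V
assert np.allclose(tiled_attention(Q, K, V), naive)

The tiling is what cuts memory traffic: only one block of scores exists at any moment, which is how the technique can trim training time without changing the underlying math.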

Key takeaways:

  • Nvidia continues to dominate machine learning benchmarks, including the new tests released by MLPerf, which focus on fine-tuning of large language models and graph neural networks.
  • Despite using the same Hopper architecture as last year, Nvidia managed to cut training times through software improvements, achieving a 27 percent gain over its June 2023 MLPerf results.
  • MLPerf added new benchmarks this year, including fine-tuning and graph neural networks, to stay relevant to the AI industry's developments.
  • Future training rounds in 2025 may see competitions between new accelerators from AMD, Intel, and Nvidia, with Nvidia planning to unveil a new architecture, Blackwell, later this year.
