
Microsoft is finally making custom chips — and they’re all about AI

Nov 15, 2023 - theverge.com
Microsoft has developed its own custom AI chip, Azure Maia, and Arm-based CPU, Azure Cobalt, to power its Azure data centers and reduce its reliance on Nvidia. The chips, set to arrive in 2024, are part of Microsoft's strategy to prepare for a future dominated by AI. The Azure Maia AI chip will be used for large language model training and inference, while the Azure Cobalt CPU, a 128-core chip, is designed to power general cloud services on Azure. Microsoft is currently testing the Cobalt CPU on workloads like Microsoft Teams and SQL Server.

The Maia 100 AI accelerator is designed for running cloud AI workloads and will power some of Microsoft's largest AI workloads on Azure, including parts of its partnership with OpenAI. The Azure Cobalt CPU lets Microsoft manage power and performance per core and per virtual machine. Microsoft's initial testing shows performance up to 40% better than the commercial Arm servers currently in its data centers, though the company is not yet sharing full system specifications or benchmarks.

Key takeaways:

  • Microsoft has developed its own custom AI chip, Azure Maia, and Arm-based CPU, Azure Cobalt, to power its Azure data centers and reduce its reliance on Nvidia's GPUs. The chips are expected to be released in 2024.
  • The Azure Cobalt CPU is a 128-core chip customized for Microsoft and designed to power general cloud services on Azure. It is currently being tested on Microsoft Teams and SQL Server workloads, with plans to make virtual machines available to customers next year.
  • The Maia 100 AI accelerator is designed for running cloud AI workloads, like large language model training and inference. It will be used to power some of Microsoft’s largest AI workloads on Azure, including parts of the partnership with OpenAI.
  • Microsoft is not planning to sell these chips the way Nvidia, AMD, Intel, and Qualcomm do. Instead, they are developed for its own Azure cloud workloads, potentially lowering the cost of AI for its customers.
