Ampere is also working with Qualcomm to optimize a joint solution for AI inferencing, delivered on a Supermicro server platform. The new 256-core AmpereOne CPU will use the same air-cooled thermal solutions as the existing 192-core CPU and deliver over 40% more performance than any CPU currently available. The company also revealed that its CPUs outperform AMD's Genoa by 50% and Bergamo by 15% in performance per watt. Ampere's CPUs are also being used to run Meta's Llama 3 on Oracle Cloud, delivering the same performance as an Nvidia A10 GPU paired with an x86 CPU while using a third of the power.
Key takeaways:
- Ampere Computing announced that its AmpereOne chip family will grow to 256 cores by next year, providing 40% more performance than any CPU currently on the market.
- Ampere is collaborating with Qualcomm Technologies to develop a joint solution for AI inferencing using Qualcomm's high-performance, low-power Qualcomm Cloud AI 100 inference solutions and Ampere CPUs.
- Ampere has an ambitious roadmap for data center CPUs, with a 256-core CPU with 12-channel memory ready to go on TSMC's N3 manufacturing process node.
- Ampere's CPUs outpace AMD's Genoa by 50% and Bergamo by 15% in performance per watt, and the company is working with Oracle to run large models in the AI cloud, cutting costs by 28% and power consumption by a third compared to rival Nvidia solutions (see the arithmetic sketch after this list).
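
To make these ratios concrete, here is a minimal arithmetic sketch. Only the percentages (50%, 15%, 28%, and the one-third power figure) come from the claims above; every wattage and performance score below is a hypothetical placeholder, not measured data from Ampere, AMD, or Nvidia.

```python
# Hypothetical worked example of the headline ratios in the takeaways above.

def perf_per_watt(performance: float, watts: float) -> float:
    """Performance-per-watt metric used in the Genoa/Bergamo comparison."""
    return performance / watts

# Hypothetical baseline part: 100 performance units at 350 W.
baseline = perf_per_watt(100.0, 350.0)

# What "50% better" and "15% better" perf/watt would mean against that baseline.
claimed_vs_genoa = baseline * 1.50
claimed_vs_bergamo = baseline * 1.15

# "Same performance at a third of the power" (Llama 3 vs. A10 + x86 claim):
# equal throughput with power divided by three, so perf/watt triples.
gpu_plus_x86_watts = 300.0               # hypothetical combined draw
cpu_only_watts = gpu_plus_x86_watts / 3

print(f"baseline perf/W:      {baseline:.3f}")
print(f"+50% claim perf/W:    {claimed_vs_genoa:.3f}")
print(f"+15% claim perf/W:    {claimed_vs_bergamo:.3f}")
print(f"'a third of the power': {cpu_only_watts:.0f} W vs {gpu_plus_x86_watts:.0f} W")
print(f"cost after a 28% cut: {1 - 0.28:.2f} of the original")
```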