The Information Suggests, in an Aside, That Apple Scrapped Work on a Quad-Max/Double-Ultra M-Series Chip

Dec 12, 2024 - daringfireball.net
Apple is developing its first AI server chip, internally code-named Baltra, in collaboration with Broadcom, which is supplying the networking technology; mass production is expected by 2026. Unlike in Broadcom's typical arrangements, Apple itself is managing the chip's production with TSMC, a departure from how Google has worked with Broadcom in the past. The project is led by Apple's silicon design team in Israel, which was instrumental in creating the processors that replaced Intel chips in Macs.

Additionally, Apple canceled the development of a high-performance Mac chip to allocate resources to the AI chip project, indicating a shift in priorities. The canceled chip, potentially an "M# Extreme," would have been a more powerful version of the M-series, designed for future Mac Pro models. This decision suggests that the introduction of such a high-end chip might be delayed, as Apple focuses on advancing its AI capabilities using TSMC's advanced N3P manufacturing process.

Key takeaways:

- Apple is developing its first AI server chip, internally code-named Baltra, in collaboration with Broadcom, with plans for mass production by 2026.
- Apple's silicon design team in Israel is leading the development of the AI chip, and the company has shifted some engineering resources from a canceled high-performance Mac chip project to focus on this AI initiative.
- The AI chip will utilize TSMC's advanced N3P manufacturing process, which is an improvement over the process used for Apple's latest M4 computer processors.
- The cancellation of the high-performance Mac chip, which was expected to be a quad-Max "Extreme" version, suggests a shift in Apple's priorities towards AI development, potentially delaying the release of such a chip.
