Meta has announced that both MTIA chips are now in production, despite initially not expecting MTIA v1 to ship until 2025. The company aims to expand the chips' capabilities to training generative AI models, such as its Llama language models. Early tests show the new chip performing three times better than the first-generation version. Meta also reportedly plans to develop other AI chips, such as Artemis, which is designed specifically for inference. The move comes as other tech giants, including Google, Microsoft, and Amazon, develop their own AI chips to meet the increasing demand for compute power.
Key takeaways:
- Meta is developing the next generation of its custom AI chip, the Meta Training and Inference Accelerator (MTIA), which will be more powerful and able to train its ranking models faster.
- The company announced MTIA v1 in May 2023 for data centers, and both MTIA chips are now in production, with the next-generation chip also likely targeting data centers.
- The new MTIA chip has 256MB of on-chip memory running at 1.3GHz, and early tests show it performing three times better than the first-generation version across four models.
- Other tech giants, including Google, Microsoft, and Amazon, are also developing their own AI chips to meet the increasing demand for compute power, a market Nvidia currently dominates.