The exact purpose of MAI-1 has not yet been determined, and its ideal use will depend on its performance. Microsoft is training the model on a large cluster of servers equipped with Nvidia GPUs and is compiling training data from a variety of sources. Depending on how the work progresses, Microsoft may preview MAI-1 as early as its Build developer conference later this month. The effort points to a dual approach to AI within Microsoft: small language models that run locally on mobile devices, and larger state-of-the-art models served from the cloud.
Key takeaways:
- Microsoft is developing a new large-scale AI language model, called MAI-1, that could rival models from Google, Anthropic, and OpenAI. It is the first time Microsoft has developed an in-house AI model of this scale since investing more than $10 billion in OpenAI.
- Development of MAI-1 is being led by Mustafa Suleyman, the former Google AI leader and CEO of the AI startup Inflection, most of whose staff Microsoft brought on board in a roughly $650 million deal in March.
- MAI-1 is expected to have roughly 500 billion parameters, making it far larger than Microsoft's earlier open-source models and placing it in the same league as OpenAI's GPT-4 (see the back-of-the-envelope sketch after this list for a sense of that scale).
- The exact purpose of MAI-1 has not been determined yet, and its best use will depend on how it performs. Microsoft may preview MAI-1 as early as its Build developer conference later this month.
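To put the reported parameter count in perspective, here is a minimal back-of-the-envelope sketch, not taken from the reporting, of how much memory it would take just to store the weights of a 500-billion-parameter dense model at a few common numeric precisions. The parameter count is the figure cited above; the precisions and the note about optimizer state and activations are general assumptions about large-model training, not details about MAI-1.

```python
# Rough, illustrative estimate only: memory needed to hold the weights of a
# hypothetical dense model with 500 billion parameters. Actual training
# requirements are much higher (optimizer states, gradients, activations),
# which is one reason such models are trained on large clusters of Nvidia GPUs.

PARAMS = 500e9  # assumed parameter count reported for MAI-1

BYTES_PER_PARAM = {
    "fp32": 4,       # full precision
    "fp16/bf16": 2,  # half precision, common for training and inference
    "int8": 1,       # quantized inference
}

for precision, nbytes in BYTES_PER_PARAM.items():
    terabytes = PARAMS * nbytes / 1e12
    print(f"{precision}: ~{terabytes:.1f} TB of memory just for the weights")
```

At half precision this works out to about a terabyte of weights alone, which is why a model of this size is a cloud-scale system rather than something that runs on a phone, in contrast to the small locally run models mentioned above.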