Pocket-Sized AI Models Could Unlock a New Era of Computing

May 25, 2024 - wired.com
The article discusses the development of smaller, more efficient AI models, focusing on Phi-3-mini, developed by Microsoft. This model, part of the Phi family, is compact enough to run on a smartphone, and in Microsoft's tests it performs comparably to GPT-3.5, the model behind the first release of ChatGPT. Microsoft's researchers found that being selective about the training data can improve an AI system's abilities without increasing its size, suggesting that future AI systems could become smarter without necessarily becoming larger.

The article also highlights the benefits of running AI models locally on devices such as smartphones and laptops. This approach reduces latency and exposure to service outages, keeps data private, and could enable new use cases for AI, such as deep integration into a device's operating system. The author suggests that Apple might reveal a similar strategy at its upcoming WWDC conference, focusing on shrinking AI to fit into its customers' pockets rather than building ever-larger cloud-based models, as OpenAI and Google have.

Key takeaways:

  • Microsoft has developed a family of smaller AI models, including Phi-3-mini, which can run on devices as small as a smartphone, demonstrating significant advancements in AI efficiency.
  • The Phi-3-mini model performs comparably to GPT-3.5, the model behind the first release of ChatGPT, according to Microsoft researchers.
  • Microsoft's approach involves being more selective about the training data for AI systems, which has been shown to improve their abilities without increasing their size.
  • Running AI models locally on devices like smartphones and laptops could reduce latency, ensure data privacy, and open up new use cases for AI, moving away from the cloud-centric model.