
Apple releases eight small AI language models aimed at on-device use

Apr 26, 2024 - arstechnica.com
Apple has introduced a set of small AI language models named OpenELM that can run directly on a smartphone. These models, which span eight variants ranging from 270 million to 3 billion parameters, are currently available on Hugging Face under an Apple Sample Code License. The models were trained on publicly available datasets, and Apple's approach includes a "layer-wise scaling strategy" that reportedly allocates parameters more efficiently across layers, improving performance while training on fewer tokens.
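
Because the checkpoints are hosted on Hugging Face, they can in principle be loaded with the standard transformers API. The sketch below is a minimal illustration; the repository name "apple/OpenELM-270M" and the Llama-2 tokenizer pairing are assumptions taken from common usage, so check the actual model card for the exact identifiers and the terms of the Apple Sample Code License before running it.

```python
# Minimal sketch: loading an OpenELM checkpoint with Hugging Face transformers.
# The repo name and tokenizer pairing below are assumptions -- verify them
# against the official model card before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M"            # assumed repository name
tokenizer_id = "meta-llama/Llama-2-7b-hf"  # assumed tokenizer pairing

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
# trust_remote_code is needed when a repo ships its own model class definition.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Apple's OpenELM models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```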

Apple has not yet integrated these AI language models into its consumer devices, but the upcoming iOS 18 update is rumored to include new AI features that rely on on-device processing to preserve user privacy. However, Apple may partner with Google or OpenAI to handle more complex, off-device AI processing to give Siri a boost. The company has also released the code for CoreNet, the library used to train OpenELM, along with reproducible training recipes, aiming to "empower and enrich the open research community."

Key takeaways:

  • Apple has introduced a set of tiny source-available AI language models called OpenELM that are small enough to run directly on a smartphone.
  • The OpenELM models are currently available on Hugging Face under an Apple Sample Code License, and the eight distinct models range from 270 million to 3 billion parameters.
  • Apple's approach with OpenELM includes a "layer-wise scaling strategy" that reportedly allocates parameters more efficiently across each layer, improving performance while training on fewer tokens (a toy illustration of the idea follows this list).
  • While Apple has not yet integrated these new models into its consumer devices, the upcoming iOS 18 update is rumored to include new AI features that rely on on-device processing to preserve user privacy.
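
To give a rough sense of what a layer-wise scaling strategy can look like, the sketch below interpolates per-layer attention-head counts and feed-forward widths linearly from the first transformer layer to the last, so deeper layers receive a larger share of the parameter budget. The scale ranges and formula here are illustrative assumptions, not Apple's published OpenELM configuration.

```python
# Illustrative sketch of layer-wise scaling: rather than giving every layer
# the same width, interpolate per-layer head counts and FFN widths across
# depth. The alpha/beta ranges are made-up values, not OpenELM's settings.

def layerwise_config(num_layers: int, d_model: int, head_dim: int,
                     alpha=(0.5, 1.0), beta=(2.0, 4.0)):
    """Return a per-layer list of (num_heads, ffn_dim) pairs."""
    configs = []
    for i in range(num_layers):
        t = i / max(num_layers - 1, 1)            # 0.0 at the first layer, 1.0 at the last
        a = alpha[0] + (alpha[1] - alpha[0]) * t  # attention width scale
        b = beta[0] + (beta[1] - beta[0]) * t     # FFN width multiplier
        num_heads = max(1, round(a * d_model / head_dim))
        ffn_dim = round(b * d_model)
        configs.append((num_heads, ffn_dim))
    return configs

# Example: a 12-layer model with d_model=768 and 64-dimensional heads.
for layer, (heads, ffn) in enumerate(layerwise_config(12, 768, 64)):
    print(f"layer {layer:2d}: heads={heads:2d}, ffn_dim={ffn}")
```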