Opera allows users to download and use LLMs locally | TechCrunch

Apr 03, 2024 - techcrunch.com
Opera, the web browser company, has announced that it will now allow users to download and use Large Language Models (LLMs) locally on their computers. This feature, which is initially being rolled out to Opera One users who receive developer stream updates, will enable users to choose from over 150 models from more than 50 families, including Llama from Meta, Gemma from Google, and Vicuna. The feature is part of Opera’s AI Feature Drops Program, which gives users early access to AI features. Opera is using the Ollama open-source framework to run these models.
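For context, Ollama (the framework Opera is building on) runs models on the local machine and exposes a small HTTP API for prompting them. The sketch below is illustrative only: it assumes a standalone Ollama install listening on its default local port (11434) and a model such as "gemma:2b" that has already been downloaded; it does not represent Opera's in-browser integration, which is handled by the browser itself. The point it demonstrates is the one the article makes: prompts and responses never leave the user's computer.

```python
import json
import urllib.request

# Minimal sketch: query a locally running Ollama server over its HTTP API.
# Assumes Ollama is installed separately and a model (e.g. "gemma:2b") has
# already been pulled; this is not Opera's browser-side integration.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_model(prompt: str, model: str = "gemma:2b") -> str:
    payload = json.dumps({
        "model": model,       # name of a model already downloaded locally
        "prompt": prompt,
        "stream": False,      # ask for one complete JSON response
    }).encode("utf-8")
    request = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # The request goes to localhost only; nothing is sent to a remote service.
    with urllib.request.urlopen(request) as response:
        body = json.loads(response.read().decode("utf-8"))
    return body["response"]

if __name__ == "__main__":
    print(ask_local_model("Summarize what a web browser does in one sentence."))
```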

However, each model variant takes up more than 2GB of space on the local system, so users need to manage their storage carefully. Opera is not doing any work to reduce storage use while a model is being downloaded. The company has been experimenting with AI-powered features since last year, launching an assistant called Aria and announcing plans to build an AI-powered browser with its own engine for iOS.

Key takeaways:

  • Opera is now allowing users to download and use Large Language Models (LLMs) locally on their computers, starting with Opera One users who get developer stream updates.
  • Users can select from over 150 models from more than 50 families, including Llama from Meta, Gemma from Google, and Vicuna.
  • The feature is part of Opera’s AI Feature Drops Program and uses the Ollama open-source framework to run these models.
  • Opera warns that each variant will take up more than 2GB of space on a local system, and the company is not doing any work to reduce storage use while a model is downloading.