However, each variant will take up more than 2GB of space on a local system, so users need to manage their storage carefully; Opera is not doing any work to save storage while a model is downloading. The company has been experimenting with AI-powered features since last year, launching an assistant called Aria and announcing plans to build an AI-powered browser with its own engine for iOS.
Key takeaways:
- Opera is now allowing users to download and use Large Language Models (LLMs) locally on their computers, starting with Opera One users who get developer stream updates.
- Users can select from over 150 models from more than 50 families, including Llama from Meta, Gemma from Google, and Vicuna.
- The feature is part of Opera’s AI Feature Drops Program and uses the open-source Ollama framework to run these models (a rough sketch of how Ollama is typically called follows this list).
- Opera warns that each variant will take up more than 2GB of space on a local system, and the company is not doing any work to save storage while a model is downloading.
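The article does not describe how Opera's browser UI talks to the models, but since the feature builds on Ollama, a minimal sketch of querying a locally running Ollama server directly gives a sense of what "running an LLM locally" involves. This assumes Ollama is installed and serving on its default port (11434) and that a model such as `gemma` has already been pulled; the model name and prompt are placeholders, not anything Opera specifies.

```python
# Minimal sketch: sending a prompt to a local Ollama server and reading the reply.
# Assumes Ollama is running on its default port and the "gemma" model is already downloaded.
import json
import urllib.request


def ask_local_model(prompt: str, model: str = "gemma") -> str:
    """Send a prompt to the local Ollama API and return the full response text."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON reply instead of a token stream
    }).encode("utf-8")
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]


if __name__ == "__main__":
    print(ask_local_model("Explain in one sentence what a local LLM is."))
```

Because everything in this exchange happens on localhost, no prompt or response leaves the machine, which is the main privacy argument for local models that the feature leans on.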