The Olympus model could reduce Amazon's reliance on Anthropic PBC, a company that has received $8 billion in funding from Amazon. The model could be integrated with Amazon Bedrock, a managed service that provides access to cloud-hosted frontier models, including more than half a dozen developed by Amazon itself. Amazon's AI strategy also extends to hardware: the company is developing two chip lineups, AWS Trainium and AWS Inferentia, optimized for training and inference workloads, respectively.
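If Olympus does arrive in Bedrock, developers would presumably reach it the same way they reach other Bedrock-hosted models, through the bedrock-runtime API. The sketch below is a minimal Python example using boto3; the model identifier and request payload are hypothetical, since Amazon has not published any details for Olympus.

```python
# Minimal sketch of invoking a Bedrock-hosted model with boto3.
# The model ID "amazon.olympus-v1" and the request schema are hypothetical:
# Olympus has not been released and no payload format has been published.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="amazon.olympus-v1",  # hypothetical identifier
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        # Natural-language prompt; the actual field names would depend on
        # whatever request format Amazon documents for the model.
        "inputText": "Summarize the key events in the uploaded footage",
    }),
)

# The response body is a streaming object; read and decode the JSON payload.
print(json.loads(response["body"].read()))
```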
Key takeaways:
- Amazon has reportedly developed a multimodal large language model, known internally as Olympus, which can process text, images, and videos.
- The model is expected to debut as early as next week during AWS re:Invent and may be offered through Amazon Web Services, possibly via the Amazon Bedrock service.
- Olympus could help users search video repositories for specific clips using natural language prompts and assist energy companies in analyzing geological data.
- The development of Olympus could be a move by Amazon to reduce its reliance on Anthropic PBC, which it has backed with $8 billion in funding, as other tech giants likewise work to bring more of their AI stacks in-house.