Amazon Reportedly Training AI With Twice as Many Parameters as GPT-4

Nov 08, 2023 - futurism.com
Amazon is reportedly developing a large language model (LLM) codenamed "Olympus" with 2 trillion parameters, double the reported parameter count of OpenAI's GPT-4. The company is investing heavily in the project, which is expected to compete with rival AI models such as OpenAI's ChatGPT and Google's Bard. Despite the large parameter count, it remains uncertain whether Olympus will outperform its competitors.

The release date for Olympus is still unknown. However, Amazon's significant resources and dominance in web hosting make it a company to watch in the AI industry. The company also invested $4 billion in AI startup Anthropic earlier this year. Despite the high expectations, experts caution that a model with more parameters does not necessarily perform better.

Key takeaways:

  • Amazon is reportedly developing a large language model (LLM) codenamed "Olympus", which is expected to have 2 trillion parameters, twice as many as OpenAI's GPT-4 LLM.
  • The company's existing infrastructure and resources, including its dominance in the web hosting space, make it a strong contender in the AI industry.
  • Olympus is not guaranteed to outperform other models; as AI expert Yann LeCun has noted, more parameters do not necessarily make a model better.
  • Amazon has shown its commitment to AI development through significant investments, including a $4 billion investment in AI startup Anthropic earlier this year.
