
X.ai's Grok-1 Model is Officially Open-Source and Larger Than Expected

Mar 18, 2024 - synthedia.substack.com
X.ai has announced that its Grok-1 large language model (LLM) is now available under the open-source Apache 2.0 license, giving users royalty-free access to the source code for both commercial and private uses. Grok-1, which concluded its pre-training phase in October 2023, is a 314 billion-parameter base model that has not been fine-tuned for any specific application; it is larger than GPT-3/3.5 but likely smaller than GPT-4. The model's architecture is based on a mixture-of-experts (MoE) design, which is believed to scale to higher performance more efficiently than simply increasing parameter counts.
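For readers unfamiliar with MoE, the sketch below illustrates the routing idea in plain NumPy: a learned gate scores all experts for each token, only the top-k experts run, and their outputs are mixed by the gate's softmax weights. This is a generic toy illustration, not Grok-1's actual code; the dimensions, initializations, and function names are made-up assumptions (Grok-1 reportedly activates 2 of its 8 experts per token).

```python
import numpy as np

# Toy mixture-of-experts (MoE) layer. All sizes are illustrative
# assumptions, not Grok-1's real hyperparameters.
rng = np.random.default_rng(0)

d_model, d_hidden = 16, 64   # hypothetical toy dimensions
n_experts, top_k = 8, 2      # 2-of-8 routing, as reported for Grok-1

# Each expert is a small two-layer feed-forward network.
experts = [
    (rng.standard_normal((d_model, d_hidden)) * 0.02,
     rng.standard_normal((d_hidden, d_model)) * 0.02)
    for _ in range(n_experts)
]
router = rng.standard_normal((d_model, n_experts)) * 0.02  # gating weights


def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-k experts and mix their outputs.

    x: (n_tokens, d_model) activations; returns the same shape.
    Only top_k of n_experts run per token, so per-token compute
    tracks top_k rather than the total parameter count.
    """
    logits = x @ router                            # (n_tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # top-k expert indices
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        # Softmax over only the selected experts' logits.
        sel = logits[t, top[t]]
        gate = np.exp(sel - sel.max())
        gate /= gate.sum()
        for g, e in zip(gate, top[t]):
            w1, w2 = experts[e]
            out[t] += g * (np.maximum(x[t] @ w1, 0.0) @ w2)  # ReLU FFN
    return out


tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)  # (4, 16)
```

Because per-token compute grows with top_k rather than n_experts, an MoE model can keep adding experts (and thus parameters and capacity) without a proportional rise in inference cost, which is the efficiency argument behind the design.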

The release of Grok-1 as open-source is a significant move, contrasting with OpenAI’s proprietary-only offerings. However, the released model is not the same one that powers the Grok AI assistant, meaning X.ai now maintains both open-source and proprietary models. This development could encourage other companies in the AI space to support open-source developers.

Key takeaways:

  • X.ai has announced that the Grok-1 large language model (LLM) is now available under the open-source Apache 2.0 license, allowing users royalty-free access to the source code for both commercial and private uses.
  • Grok-1 is a pre-trained base model, not fine-tuned for any specific application, with 314 billion parameters, larger than GPT-3/3.5 but likely smaller than GPT-4.
  • Grok-1's architecture is based on a mixture-of-experts (MoE) design, which is considered a more efficient way to scale to higher performance than simply increasing parameter counts.
  • Despite the open-source release of Grok-1, X.ai maintains both proprietary and open-source models, a move that contrasts with the proprietary-only offerings of OpenAI.
