
xAI open-sources base model of Grok, but without any training code | TechCrunch

Mar 18, 2024 - techcrunch.com
Elon Musk's xAI has released the base model weights and network architecture of Grok-1, a 314-billion-parameter Mixture-of-Experts model, on GitHub. The model is licensed under the Apache License 2.0, but it was released without any training code and has not been fine-tuned for any specific application. Grok-1 was trained on a custom stack, though xAI did not specify the details. The company had previously offered Grok as a chatbot to Premium+ subscribers of the X social network, but the open-source release does not include those connections to X.

Many companies, including Meta and Google, have open-sourced AI models of their own, and some AI tool makers are already planning to build on Grok. Perplexity's CEO, Aravind Srinivas, announced that his company will fine-tune Grok for conversational search and make it available to Pro users. Meanwhile, Musk is in a legal dispute with OpenAI, alleging that the company has betrayed its original nonprofit mission, and has publicly criticized it and its CEO, Sam Altman.

Key takeaways:

  • Elon Musk's xAI has open-sourced the base model weights of Grok-1, a 314-billion-parameter Mixture-of-Experts model, but without any training code.
  • Grok-1 is licensed under the Apache License 2.0, allowing commercial use, and was not tuned for any particular application.
  • Perplexity CEO Aravind Srinivas announced plans to fine-tune Grok for conversational search and make it available to Pro users.
  • Musk is in a legal dispute with OpenAI, accusing the company of betraying its original nonprofit mission.