Stability AI unveils smaller, more efficient 1.6B language model as part of ongoing innovation

Jan 20, 2024 - venturebeat.com
Stability AI has launched one of its smallest models yet, Stable LM 2 1.6B, a text-generation large language model (LLM). The model, which supports seven languages, aims to lower barriers and enable more developers to participate in the generative AI ecosystem. Despite its smaller size, the model outperforms other small language models with under 2 billion parameters on most benchmarks, and even surpasses some larger models, including Stability AI’s own earlier Stable LM 3B.

The new Stable LM 2 models are trained on more data, including multilingual documents in six languages in addition to English. Stability AI is releasing the models in pre-trained and fine-tuned variants, as well as a format described as the last model checkpoint before the pre-training cooldown, giving developers more tools to innovate and build on top of the current model.

Key takeaways:

  • Stability AI has released one of its smallest models yet, Stable LM 2 1.6B, a text-generation large language model (LLM).
  • The new model aims to lower barriers and enable more developers to participate in the generative AI ecosystem, incorporating multilingual data in seven languages.
  • Despite its size, the Stable LM 2 1.6B outperforms other small language models with under 2 billion parameters on most benchmarks, and even surpasses some larger models.
  • Stability AI is making the new models available with pre-trained and fine-tuned options, and a format that allows developers to further specialize the model for other tasks or datasets.