Databricks presents its Mosaic AI Foundation Model product as the managed solution to these roadblocks: in addition to running DBRX and other models, it provides a training stack for fine-tuning DBRX on custom data. Despite the model's limitations, Naveen Rao, Databricks' VP of generative AI, promises that the company will continue to refine DBRX and release new versions as its Mosaic Labs R&D team investigates new generative AI avenues. Still, given its steep hardware requirements and the competitive offerings from generative AI rivals, the model seems like a tough sell to anyone but current or would-be Databricks customers.
Key takeaways:
- Databricks has announced a new generative AI model, DBRX, which is available on GitHub and the AI dev platform Hugging Face for research and commercial use.
- The company spent roughly $10 million and eight months training DBRX, which it claims outperforms all existing open source models on standard benchmarks.
- However, DBRX is difficult to use unless you're a Databricks customer, owing to its hardware requirements and licensing restrictions that apply to companies with more than 700 million active users.
- Despite these limitations, Databricks plans to continue refining DBRX and releasing new versions as it explores new generative AI avenues.