Google announces Gemma 2, a 27B-parameter version of its open model, launching in June | TechCrunch

May 14, 2024 - techcrunch.com
Google announced several new additions to its Gemma model family at the Google I/O 2024 developer conference, including Gemma 2, a 27-billion-parameter model set for release in June. The company also introduced PaliGemma, a pre-trained Gemma variant designed for image captioning, image labeling, and visual Q&A use cases. Prior to these releases, Gemma models were available only in 2-billion-parameter and 7-billion-parameter versions.

Josh Woodward, Google’s VP of Google Labs, said the Gemma models have been downloaded millions of times across the services where they are available. He emphasized that the new 27-billion-parameter model is optimized to run on Nvidia’s next-gen GPUs, a single Google Cloud TPU host, and the managed Vertex AI service. Although Google has yet to share performance data for Gemma 2, Woodward claimed it already outperforms models twice its size.

Key takeaways:

  • Google announced new additions to Gemma, its family of open models, at the Google I/O 2024 developer conference, including the launch of Gemma 2, a 27-billion-parameter model.
  • PaliGemma, a pre-trained Gemma variant for image captioning, image labeling and visual Q&A use cases, is already available.
  • Josh Woodward, Google’s VP of Google Labs, noted that the Gemma models have been downloaded millions of times, and that the new 27-billion-parameter model is optimized to run on Nvidia’s next-gen GPUs, a single Google Cloud TPU host, and the managed Vertex AI service.
  • Google claims that Gemma 2 already outperforms models twice its size, but more benchmark data and developer feedback are needed to confirm its performance.