
Google’s Hugging Face deal puts ‘supercomputer’ power behind open-source AI

Jan 25, 2024 - theverge.com
Google Cloud has partnered with AI model repository Hugging Face to let developers build, train, and deploy AI models without a Google Cloud subscription. Developers using Hugging Face’s platform will have cost-effective access to Google’s tensor processing units (TPUs) and GPU supercomputers, including thousands of Nvidia’s H100s. Hugging Face, valued at $4.5 billion, hosts over 350,000 models on its platform and is known for hosting open-source foundation models such as Meta’s Llama 2 and Stability AI’s Stable Diffusion.

Hugging Face users will be able to start using the AI app-building platform Vertex AI and Google Kubernetes Engine for model training and fine-tuning in the first half of 2024. Google’s partnership with Hugging Face aims to further support the development of the open-source AI ecosystem. However, some of Google’s own models, such as the large language model Gemini and the text-to-image model Imagen, are not on Hugging Face and remain more closed source.

Key takeaways:

  • Google Cloud has partnered with AI model repository Hugging Face, allowing developers to build, train, and deploy AI models without a Google Cloud subscription.
  • Developers using Hugging Face’s platform will have cost-effective access to Google’s TPU and GPU supercomputers, including thousands of Nvidia’s H100s.
  • Hugging Face, valued at $4.5 billion, hosts over 350,000 models on its platform for developers to work with or upload their own models, similar to GitHub.
  • Google said Hugging Face users can start using the AI app-building platform Vertex AI and Google Kubernetes Engine for training and fine-tuning models in the first half of 2024.
