Google’s Hugging Face deal puts ‘supercomputer’ power behind open-source AI
Hugging Face is one of the most popular AI model repositories, hosting open-source foundation models like Meta’s Llama 2 and Stability AI’s Stable Diffusion. It also hosts many datasets for model training.
Developers can work with the more than 350,000 models hosted on the platform or upload their own, much like coders share their code on GitHub. Hugging Face is valued at $4.5 billion; Google, Amazon, Nvidia, and others helped it raise $235 million over the last year.
Google said that Hugging Face users will be able to start using its AI app-building platform Vertex AI and Google Kubernetes Engine, which help train and fine-tune models, “in the first half of 2024.”
Google said in a statement that its partnership with Hugging Face “furthers Google Cloud’s support for open-source AI ecosystem development.” Some of Google’s models are available on Hugging Face, but its flagship large language models, such as Gemini (which now powers the chatbot Bard) and the text-to-image model Imagen, are not on the repository and are considered more closed-source.