In a world increasingly shaped by artificial intelligence, few companies have left their mark on 2024 quite like the open source AI company Hugging Face.
What started as a chatbot app has since evolved into a hub for open source AI, becoming an indispensable resource for researchers, developers, and businesses alike. After several funding rounds, Hugging Face was valued at $4.5 billion in 2023.
Hugging Face is Emerge’s 2024 Project of the Year for its innovative role in AI and its commitment to democratizing machine learning. Through visionary leadership, open source tools, and a strong focus on ethics, the company empowers researchers and startups around the world. Thanks to a vibrant online community of open source AI enthusiasts, Hugging Face has become the standard for responsible, collaborative AI innovation.
What is Hugging Face?
Hugging Face is an open source platform for machine learning and natural language processing, founded in 2016 by French entrepreneurs Clément Delangue, Julien Chaumond, and Thomas Wolf and based in New York.
With a massive library of over 1 million AI models, 190,000 datasets, and 55,000 demo apps, Hugging Face allows developers, researchers, and data scientists to build, train, share, and deploy AI models.
“We started out as a gaming company, but we realized we could have a much bigger impact if we started making some of our research code open source. That led to the Transformers library, and we are excited to see the impact and excitement around it in the community,” Wolf, co-founder and chief science officer, told Decrypt. “We believe that open source is a key approach to democratizing machine learning.”
At its core is the Transformers library, which provides state-of-the-art pre-trained models for a variety of tasks. Users can explore models through browser-based inference widgets, access them through APIs, and deploy them across computing environments. Hugging Face also fosters collaboration by allowing users to share and fine-tune models through the Hub, a central repository where they can experiment and contribute to cutting-edge AI models.
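To make that workflow concrete, here is a minimal sketch of pulling a pretrained model from the Hub and running inference locally with the Transformers pipeline API. The checkpoint name is just one of the many publicly hosted models and is used purely for illustration, not as a recommendation.

```python
# Minimal sketch: load a pretrained model from the Hugging Face Hub and run
# local inference via the Transformers pipeline API. The model id below is
# one illustrative public checkpoint among the many hosted on the Hub.
from transformers import pipeline

# Downloads the model weights and tokenizer on first use, then runs locally.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Open source AI tools lower the barrier to entry."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```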
Fine-tuning means taking a pre-trained AI model, whose weights and features were learned from an initial training dataset, and training it further so that it performs a specific task or improves on specialized datasets.
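As a rough sketch of what that looks like in practice, the snippet below fine-tunes a pretrained checkpoint on a downstream text-classification dataset using the Transformers Trainer API. The base model, dataset, and hyperparameters are illustrative assumptions, not a prescribed Hugging Face recipe.

```python
# Illustrative fine-tuning sketch: adapt a pretrained model to a new task
# with the Transformers Trainer API. Model, dataset, and hyperparameters
# are placeholder choices for demonstration only.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"      # pretrained weights from the Hub
dataset = load_dataset("imdb")              # example downstream dataset
tokenizer = AutoTokenizer.from_pretrained(model_name)

def tokenize(batch):
    # Convert raw text into token ids the model expects.
    return tokenizer(batch["text"], truncation=True, padding="max_length")

tokenized = dataset.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="finetune-demo",
        num_train_epochs=1,
        per_device_train_batch_size=8,
    ),
    # Small subsets keep this demo cheap; a real run would use the full splits.
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=tokenized["test"].select(range(500)),
)

trainer.train()   # updates the pretrained weights for the new task
```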
“Open science and open source AI help address today’s challenges, such as preventing black-box systems, increasing corporate accountability, mitigating bias, reducing misinformation, promoting copyright, and rewarding all stakeholders, including artists and content creators, in the value creation process,” co-founder and CEO Delangue said on X (formerly Twitter).
Democratizing AI
A common refrain in the decentralized and open source communities is to “democratize AI”: to empower individuals to use AI for social good, innovation, and solving complex problems, free from the control of corporations and governments.
In an industry dominated by proprietary technologies and closed ecosystems, Hugging Face stands out by making cutting-edge tools freely available to the global AI community. Delangue reiterated Hugging Face’s commitment to the cause of democratizing AI during a congressional hearing of the Science, Space, and Technology Committee in June 2023.
“Hugging Face is a community-oriented company based in the United States with a mission to democratize good machine learning,” Delangue said during the hearing. “We advance our mission primarily through open source and open science: we have a platform for hosting machine learning models and datasets, infrastructure to support research, and resources to lower the barriers for people of all backgrounds to contribute to AI.”
The democratization of AI will have a particularly big impact on underserved regions and industries where researchers and small startups lack the resources to compete with the tech giants.
“The long-standing and widening resource gap, particularly between industry and academia, is limiting who can contribute to innovative research and applications,” Delangue told Congress. “We strongly support U.S. national AI research resources and provide resources to small businesses and startups conducting research in the public interest.”
Cooperation rather than competition
Highlighting its collaborative spirit, Hugging Face has partnered with other big names in AI, including Google, AWS, Meta, Nvidia, and Microsoft.
Last January, Hugging Face teamed up with Google Cloud, combining its open models with Google’s infrastructure to make AI more accessible. That same month, Hugging Face introduced the Hallucinations Leaderboard, launched to address the ongoing problem of AI hallucinations.
“The challenge now is to get enough startups and teams to deploy the model across a variety of industries,” Wolf said. “There is no need to wait for GPT-5. Now is the time to learn how to use, evaluate, and adapt these models to build AI applications in today’s world.”
Last May, Hugging Face expanded its partnership with Microsoft, which began in 2022, to give developers broader infrastructure and tools for building more robust versions of Copilot AI models. Later that month, Amazon announced a new partnership with Hugging Face to make it easier for developers to run AI models on Amazon’s custom chips.
Chipmaking giant Nvidia announced in July that it was collaborating with Hugging Face to bring Nvidia-accelerated inference to the open source platform, allowing developers to deploy AI models like Llama 3 with token processing speeds up to five times faster.
Last October, Hugging Face launched HuggingChat, an open source alternative to OpenAI’s ChatGPT that lets users choose from a diverse pool of open source AI models for text generation. This was followed by the launch of Hugging Face Generative AI Services (HUGS), which allows developers to deploy and run AI models offline in their own environments.
At the Conference on Robot Learning in Germany last November, Hugging Face and Nvidia announced a partnership to advance open source robotics, combining Hugging Face’s LeRobot robotics platform with Nvidia’s AI tools to pair simulation with real-world training and make robots smarter and more effective.
But it hasn’t always been smooth sailing for Hugging Face. Last November, the company faced backlash after it was revealed that it had published a dataset of more than a million posts scraped from the emerging Bluesky social media platform; the dataset was removed the next day.
“Bluesky data has been removed from the repo. I wanted to support the development of tools for the platform, but I recognize that this approach violated the principles of transparency and consent in data collection,” Hugging Face machine learning librarian Daniel van Strien wrote on Bluesky. “I apologize for this mistake.”
The future of Hugging Face
As we move into 2025, Hugging Face’s CEO has offered predictions for AI in the coming year, including the first major public protests related to AI, a major company seeing its market capitalization halved because of AI, and pre-orders for more than 100,000 personal AI robots.
“With 15 million AI builders joining Hugging Face, we will begin to see the economic and employment growth potential of AI,” Delangue said on X.
Wolf shared a similarly optimistic view about the future of open source AI and robotics moving into 2025, pointing to more energy-efficient, open models.
“A lot of things excite me about the future, but just to name a few,” Wolf said: “smaller models that can be much more energy efficient, the rise of open source robotics, and the expansion of AI tools into scientific fields such as weather prediction and materials discovery.”
Hugging Face has played a pivotal role in the evolution of AI in 2024 by lowering the barriers for startups and developers to create diverse AI solutions while fostering innovation, global accessibility, and transparency.