Having become essential to the entire AI ecosystem, Hugging Face now aims to make robotics accessible to as many people as possible. Its co-founder details the startup's ambitions.
JDN. Who uses Hugging Face today: researchers, developers, start-ups or large companies?
Thomas Wolf. Initially, Hugging Face was aimed mainly at researchers. The community then expanded with the arrival of hobbyists. Over time, usage has balanced out, and many users rely on it in both professional and personal settings.
Today, everyone builds on Hugging Face: independent developers, start-ups, large companies, NGOs and governments. In September, we launched the Enterprise Hub, a version of the Hub designed for major accounts, which includes features such as access control, private resources and reinforced security. We already have more than 2,000 corporate clients of all sizes, from Nvidia to young start-ups.
Does your platform also attract people new to AI, who are not necessarily engineers?
Indeed. Hugging Face rests on three main pillars. The first is models, used by those who want to integrate AI into their applications without depending on closed solutions. The second is datasets, used to train those models. And the third is applications, called "Spaces". There are almost 100,000 of them, accessible to everyone.
You can think of it as a kind of app store for AI. Just type what you are looking for, such as a tool to remove the background from a photo or to generate a 3D character from a prompt, and you will find an application for that need. All of this works through a no-code interface, directly on our platform. Some of these apps are very successful, such as Comic Factory, which lets you create comic strips.
Hugging Face offers many free features. How do you finance the platform? Should we expect access to certain options to be gradually restricted?
This is a question we have been getting for a long time, and the answer remains no. We have no plans to restrict access to free features, because our economic model works well. We generate revenue through consulting for very large accounts such as Amazon or Nvidia, as well as through our enterprise offering, the Enterprise Hub. We also offer individual subscriptions, for example for developers who want more computing power to run their applications.
“Today we generate a positive cash flow”
The majority of the apps on the platform are developed by the community. Some run on our shared resources, others on dedicated servers when their creator has opted for a paid plan. In short, our model is solid, especially since we raised funds in August 2023 and now generate positive cash flow.
Do you train your own large AI models internally?
No, we do not work on so-called "frontier" models like GPT-4. Instead, we focus on smaller models, designed to run locally on your devices. We think their potential is enormous. I am personally very enthusiastic about developing lighter LLMs, which do not need to memorize the entire internet and can rely on external tools to accomplish their tasks.
Does OpenAI still seem out of reach despite its colossal resources?
Until last year, many thought OpenAI was too far ahead. But the arrival of DeepSeek changed the perception of many players. For open source, it was a bit like the equivalent of the "ChatGPT moment" in terms of impact. The gap between open-source models and closed models keeps narrowing. Of course, there will always be a slight gap: everything is public in open source, which lets closed players draw on it to offer slightly better versions.
It is as if one student worked with an open book while his neighbor only had to read the answer and improve on it. But this gap is becoming less and less significant. Today, for companies, the choice of model is no longer a central issue. They can switch from GPT to Claude or Gemini without difficulty. And thanks to open source, they can choose their own infrastructure, their own chips, and so on.
Why did you acquire the Bordeaux-based start-up Pollen Robotics last April, and what are your ambitions in robotics?
Rémi Cadène, formerly of Tesla, joined us in March of last year to develop software solutions dedicated to robotics. Our goal is to create a standard library around policies, the algorithms that control robots.
“We were already working with Pollen Robotics, and the idea of integrating them came naturally”
We launched an open-source library called LeRobot in May 2024, aimed at standardizing datasets and making robotics accessible through AI. It is beginning to federate a real community, made up of researchers as well as enthusiasts. More than 100 contributors participate, and our Discord server already has 8,000 members. We were already working with Pollen Robotics, and the idea of integrating them came naturally. Pollen are experts in open-source hardware, we in open-source software; the complementarity was obvious.
You recently announced the release of two open-source robots, HopeJR and Reachy Mini. Who are they aimed at?
With HopeJR, a humanoid priced at 3,000 euros, and Reachy Mini, offered at 300 euros, we are trying to offer robots at relatively accessible prices, especially compared with Tesla's Optimus, announced at several tens of thousands of dollars, or Boston Dynamics' robots, which exceed a hundred thousand. Our goal is to democratize robotics, especially among software developers. We want to offer them a platform, datasets and affordable robots so they can experiment, learn or create. Before the acquisition, Pollen Robotics already marketed Reachy 2, a humanoid robot sold for around 70,000 euros, intended for research laboratories or industrial firms wishing, for example, to train it to automate certain tasks.
What is Hugging Face's involvement in open-source robotic arms like the SO-100, whose estimated cost is under 100 euros?
Originally, this project came from a member of our community, who adapted a robotic arm to make it simple to build and inexpensive. We then open-sourced the plans for the SO-100, a 3D-printable arm that anyone can build themselves. Some companies have decided to produce it and sell it as kits, but for our part we sell nothing and earn no revenue from this project; we leave that to third-party companies. Of course, anyone with a 3D printer can download the plans for free and assemble the arm for an estimated cost of under 100 euros. In April 2025, we launched the SO-101, a more robust version. The ready-to-use version is programmable and fully compatible with our LeRobot library, at an estimated cost of under 500 dollars.
Do you think robotics can become central to your business?
I don't think so. Robotics remains a bet, while we live in an era dominated by the internet, where the most-used models will undoubtedly remain tied to textual uses such as LLMs. That said, we hope to see open source progress in robotics as well, because there is a lot to explore, whether for entertainment, domestic tasks or industrial applications.
“Robotics remains a bet, while we live in an era dominated by the internet”
With our LeRobot library, we seek to standardize datasets so that researchers, entrepreneurs and hobbyists can train their robots from a common base. This makes data sharing easier and opens the way to transferring a dataset from one robot to another, which is still difficult today. By pooling these resources, we can create better models. In the longer term, robotics could also allow Hugging Face to diversify its revenue, for example by selling robots.
What are the main development axes for Hugging Face in the coming years?
We will continue to invest in small models, a segment largely abandoned by the major technology players. These companies prefer to sell tokens by usage rather than supply models directly to their customers. For our part, we want to keep sharing as much as possible, by publishing recipes, training datasets and everything the community needs to create, adapt and reproduce these models. We are convinced that the more sharing there is, the more the entire ecosystem progresses.
Thomas Wolf is co-founder and chief science officer of Hugging Face, where he leads open-source, educational and experimental initiatives. A defender of open science, he is co-author of the book Natural Language Processing with Transformers (O'Reilly). A graduate of École Polytechnique, he worked at Lawrence Berkeley National Lab before completing a physics PhD at the Sorbonne and ESPCI. He then practiced as a patent counsel after obtaining a law degree from Panthéon-Sorbonne.