A rising figure in AI, Nick Frosst co-founded Cohere after a notable stint at Google Brain alongside Geoffrey Hinton. For the JDN, he breaks down the resolutely "enterprise focus" strategy that guides Cohere.
JDN. You present your family of Command models as designed natively for enterprises. Concretely, what distinguishes your models from those of other vendors?
Nick Frosst. We have a very concrete answer to this question: our model is extremely easy to deploy. It can run on only two GPUs. Compare this to another open model like DeepSeek, which we significantly outperform: DeepSeek requires between 16 and 32 GPUs to deploy. We are therefore eight to sixteen times more efficient to deploy than DeepSeek. That is crucial for us because we offer on-premise deployments. When we provide a copy of our model to a company, the company must have the hardware needed to run it. This is why we have designed a model that is so easy to deploy.
OpenAI and Anthropic, for example, do not offer private deployments and have not published the weights of their models, so we do not know their exact size. But they are probably larger than DeepSeek. This is really a central point for us.
“We do not focus on the features that companies do not need”
We also have other priorities: we train our model only on content relevant to enterprises. Our model excels at many legacy programming languages that are not essential for the general public but remain critical for certain large companies. For example, our model handles COBOL well, which may seem anachronistic, but it illustrates the type of needs we focus on.
Conversely, we do not focus on features that companies do not need. Our model does not generate images, for example, because that is not a priority for businesses. This approach saves parameters, and it is how we manage to build a model that achieves excellent results on the benchmarks that matter to us while requiring only two GPUs.
Interestingly, you make your models open source, but only for research purposes. Do you think open source is the right way to foster innovation?
We release the weights of our models so that everyone can examine them. We also offer a research license to contribute to the development of the community. That really facilitates adoption. If people are curious about our model, want to evaluate its performance, or want to understand how a deployment would work, we can tell them: the weights are available, you can try it now. This approach has built a relationship of trust with users.
Let’s talk about your customers. Do you mainly target multinationals and mid-sized companies, or do you also have SMEs among your customers?
We cover the whole spectrum. We count some of the largest global companies among our customers: LG, Fujitsu, Oracle, RBC, STC, to name a few. In Europe, SAP is a major client, probably our biggest customer. But we also work with more modest companies. Some of these smaller companies are interested in our North deployments (an AI agent platform, editor’s note). Others simply use a copy of the model. We also offer an API that developers use to create startups and build various projects.
How do you monetize your technology today? What are your main revenue models?
A large part of our revenue comes from work with large companies. We either provide them with a copy of the model, as we did for Oracle and Fujitsu, or we adapt the model to the languages that interest them. We have invested heavily in multilingualism: our model masters 23 languages, including French. We then customize further according to the customer's needs. For example, we developed an improved Japanese version of our model for Fujitsu, and a version optimized for Korean for LG. Today, most of our revenue comes from these major enterprise contracts, for which we create a tailor-made model or deploy North in their environment. Our main product is our Command model; that is the one we market. But when we work with a partner to personalize it, they get a unique model.