AI enters a frantic race for giant ecosystems

AI leaders are striking deal after deal to enrich their respective ecosystems and sustain their growth in a market that demands colossal resources in energy, computing power and talent.

One more. Less than a month after announcing its partnership with Nvidia, OpenAI has signed a new agreement with a semiconductor giant. This time it is AMD, currently Nvidia’s only serious rival in chips dedicated to cutting-edge AI. OpenAI has committed to purchasing 6 gigawatts’ worth of chips, representing several tens of billions of dollars of investment, and to taking a stake of up to 10% in AMD, whose market capitalization is around $270 billion.

This announcement once again demonstrates that AMD has become a player to be reckoned with in the large language model ecosystem. Founded in 1969 and written off as obsolete barely ten years ago, the company has staged a remarkable recovery under Lisa Su, its current CEO (who is also a cousin of Nvidia boss Jensen Huang), to the point of being today the only company capable of offering an ecosystem that rivals Nvidia’s in AI.

OpenAI leads the way

While the deal shows that AMD is a player to be taken seriously, it also illustrates the company’s still-inferior position relative to Nvidia, according to Dylan Patel, an analyst at SemiAnalysis, a firm specializing in semiconductor research.

“The devil is in the details: it is easy to see who had the upper hand in the negotiations, and it was not AMD. When Nvidia and OpenAI signed their agreement, it was Nvidia that obtained OpenAI shares. Here, it is OpenAI that obtains 160 million shares at the favorable price of $0.01 apiece! It is in effect a disguised rebate, settled in AMD shares. Which means that if AMD had sold its chips at full price, OpenAI would probably have stayed with Nvidia.”

Multiplication of agreements

The announcement comes as AI and semiconductor players are caught up in a deal frenzy. In mid-September, Nvidia announced it would invest $100 billion in OpenAI over the next decade, while ChatGPT’s parent company committed to building and deploying the equivalent of 10 gigawatts of Nvidia chips, a whopping 4 to 5 million graphics cards.

Jensen Huang’s company, however, does not intend to put all its eggs in one basket. Nvidia’s boss has already said he will participate in xAI’s next fundraising round, through which Elon Musk aims to raise $20 billion. An investment of two billion dollars from Nvidia has been mentioned, an announcement which, once confirmed, will no doubt be accompanied by a huge order of graphics processors from xAI, which spares no effort to amass computing power and the energy needed to run it.

For its part, Sam Altman’s company has also joined forces with Broadcom, another American semiconductor player, with which OpenAI has been collaborating for eighteen months to design its own custom chips dedicated to inference. OpenAI has committed to purchasing an additional 10 gigawatts of computing power from Broadcom, which is expected to represent an investment of some $500 billion.

Power is power

It is interesting to note that in each of these agreements, the parties no longer think first in terms of the number of semiconductors purchased, but in gigawatts, that is, in terms of the electrical power needed to satisfy the ever more insatiable needs of the AI giants. In other words, they can no longer talk only about computing power: they must think systemically, taking into account data centers and the energy required to run them. AI’s appetite for energy is indeed one of the biggest challenges facing the industry, one that is forcing players to finance energy production capacity themselves.

In such a context, investing in AI infrastructure is no longer only about getting your hands on the most efficient semiconductors (this remains, of course, a sine qua non, reflected in the unassailable positions of Taiwan’s TSMC and the Netherlands’ ASML), but also about excelling in energy generation, energy efficiency and grid integration. AMD has clearly grasped this systemic logic, as illustrated by its acquisition of server specialist ZT Systems last year.

AI in the age of ecosystems

Beyond the energy question, these cascading agreements also show the extent to which no AI player, however powerful, can dominate the market alone. Each needs the others to establish its hold over the largest possible share of the ecosystem. Without Nvidia’s graphics processors, OpenAI, xAI, Google and others cannot train and run their large language models. And those processors are in such high demand that everyone feels compelled to secure their supply by sitting down at the negotiating table with Jensen Huang.

But the AI ecosystem, by virtue of its sheer size, also needs the electrical power required to run the servers (which explains big tech’s investments in energy infrastructure); companies with the know-how to produce chips at industrial scale (hence the recent agreement between Mistral and ASML, and that between OpenAI and TSMC); rare and precious talent (hence the proliferation of “acqui-hires”); and even the diplomatic weight of governments to secure lucrative contracts abroad. Recall how Nvidia recently won major contracts in the United Kingdom and the Gulf states following Donald Trump’s visits there. Jensen Huang’s efforts to win over the American president suddenly make sense…

All these companies are thus both competitors and partners in a sort of giant oligopolistic market in which everyone knows they cannot win alone, yet strives to grab the largest possible share of the pie by extending control over their ecosystem as far as possible. As long as the market keeps growing, this dynamic lets players fuel one another’s growth and access the technical expertise needed to keep moving forward in an increasingly complex market. But the growing interdependence between players and the self-fulfilling logic of these agreements (AMD’s stock jumped 25% on the announcement of its partnership with OpenAI) also reinforce the risk of a bubble, and of the cataclysmic consequences its bursting would have in the event of a market slowdown…

Jake Thompson
Growing up in Seattle, I've always been intrigued by the ever-evolving digital landscape and its impacts on our world. With a background in computer science and business from MIT, I've spent the last decade working with tech companies and writing about technological advancements. I'm passionate about uncovering how innovation and digitalization are reshaping industries, and I feel privileged to share these insights through MeshedSociety.com.