
Microsoft acquires twice as many Nvidia AI chips as tech rivals


Microsoft bought twice as many of Nvidia’s flagship chips as any of its largest rivals in the US and China this year, as OpenAI’s biggest investor accelerated its investment in artificial intelligence infrastructure.

Analysts at Omdia, a technology consultancy, estimate that Microsoft bought 485,000 of Nvidia’s “Hopper” chips this year. That put Microsoft far ahead of Nvidia’s next biggest US customer Meta, which bought 224,000 Hopper chips, as well as its cloud computing rivals Amazon and Google.

With demand outstripping supply of Nvidia’s most advanced graphics processing units for much of the past two years, Microsoft’s chip hoard has given it an edge in the race to build the next generation of AI systems.

This year, Big Tech companies have spent tens of billions of dollars on data centres running Nvidia’s latest chips, which have become the hottest commodity in Silicon Valley since the debut of ChatGPT two years ago kick-started an unprecedented surge of investment in AI.

Microsoft’s Azure cloud infrastructure was used to train OpenAI’s latest o1 model, as they race against a resurgent Google, start-ups such as Anthropic and Elon Musk’s xAI, and rivals in China for dominance of the next generation of computing.

Omdia estimates ByteDance and Tencent each ordered about 230,000 of Nvidia’s chips this year, including the H20 model, a less powerful version of Hopper that was modified to meet US export controls for Chinese customers.

Amazon and Google, which along with Meta are stepping up deployment of their own custom AI chips as an alternative to Nvidia’s, bought 196,000 and 169,000 Hopper chips respectively, the analysts said.

Omdia analyses companies’ publicly disclosed capital spending, server shipments and supply chain intelligence to calculate its estimates.

[Bar chart: 2024 shipments of Nvidia Hopper GPUs (’000), showing Microsoft’s spending on Nvidia’s AI chips has far outstripped rivals’]

The value of Nvidia, which is now starting to roll out Hopper’s successor Blackwell, has soared to more than $3tn this year as Big Tech companies rush to assemble increasingly large clusters of its GPUs.

However, the stock’s extraordinary surge has waned in recent months amid concerns about slower growth, competition from Big Tech companies’ own custom AI chips and potential disruption to its business in China from Donald Trump’s incoming administration in the US.

ByteDance and Tencent have emerged as two of Nvidia’s biggest customers this year, despite US government restrictions on the capabilities of American AI chips that can be sold in China.

Microsoft, which has invested $13bn in OpenAI, has been the most aggressive of the US Big Tech companies in building out data centre infrastructure, both to run its own AI services such as its Copilot assistant and to rent out to customers through its Azure division.

Microsoft’s Nvidia chip orders are more than triple the number of the same generation of Nvidia’s AI processors that it bought in 2023, when Nvidia was racing to scale up production of Hopper following ChatGPT’s breakout success.

“Good data centre infrastructure, they’re very complex, capital intensive projects,” Alistair Speirs, Microsoft’s senior director of Azure Global Infrastructure, told the Financial Times. “They take multi-years of planning. And so forecasting where our growth will be with a little bit of buffer is important.”

Tech companies around the world will spend an estimated $229bn on servers in 2024, according to Omdia, led by Microsoft’s $31bn in capital expenditure and Amazon’s $26bn. The top 10 buyers of data centre infrastructure, which now include relative newcomers xAI and CoreWeave, make up 60 per cent of global investment in computing power.

Vlad Galabov, director of cloud and data centre research at Omdia, said some 43 per cent of spending on servers went to Nvidia in 2024.

“Nvidia GPUs claimed a tremendously high share of the server capex,” he said. “We are close to the peak.”

[Bar chart: Capital expenditure on servers, 2024 ($bn), showing Big Tech’s biggest spenders in the AI data centre boom]

While Nvidia still dominates the AI chip market, its Silicon Valley rival AMD has been making inroads. Meta bought 173,000 of AMD’s MI300 chips this year, while Microsoft bought 96,000, according to Omdia.

Big Tech companies have also stepped up use of their own AI chips this year, as they try to reduce their reliance on Nvidia. Google, which has for a decade been developing its “tensor processing units”, or TPUs, and Meta, which debuted the first generation of its Meta Training and Inference Accelerator chip last year, each deployed about 1.5mn of their own chips.

Amazon, which is investing heavily in its Trainium and Inferentia chips for cloud computing customers, deployed about 1.3mn of those chips this year. Amazon said this month that it plans to build a new cluster using hundreds of thousands of its latest Trainium chips for Anthropic, an OpenAI rival in which Amazon has invested $8bn, to train the next generation of its AI models.

Microsoft, however, is much earlier in its effort to build an AI accelerator to rival Nvidia’s, with only about 200,000 of its Maia chips installed this year.

Speirs said that using Nvidia’s chips still required Microsoft to make significant investments in its own technology to offer a “unique” service to customers.

“To build the AI infrastructure, in our experience, is not just about having the best chip, it is also about having the right storage components, the right infrastructure, the right software layer, the right host management layer, error correction and all these other components to build that system,” he said.
