Last week, Microsoft delivered one of the most significant announcements the chip market has seen in some time. Microsoft MSFT now has a custom-designed AI accelerator and a custom Arm-based CPU, adding to the growing range of products that support the vertical integration of its services and solutions.
The headliner is the new Azure Maia 100 AI accelerator chip. It is designed for both training and inference (building models and then using them for tasks) and was developed in collaboration with OpenAI, the AI titan behind ChatGPT. The Maia 100 is not a traditional graphics processing unit (GPU) like those from Nvidia NVDA or AMD, but rather a custom design built specifically for AI processing, which in theory should give the chip an advantage in performance, efficiency or both.
Microsoft’s Maia 100 will be aimed at workloads such as Bing, various Copilot-branded features, ChatGPT and other generative AI systems. Microsoft said the chip is specifically crafted for large language models, the type of AI that powers ChatGPT, but did not say whether that would mean limitations in other areas of AI computing. As with nearly all custom-designed chips, targeting a particular workload can yield significant gains on that workload, but at the cost of more generalized computing.
Custom AI accelerators and CPUs mean Microsoft can now fully vertically integrate its data centers and control its own destiny if it so chooses. By owning the silicon, servers, software and services that run on them, Microsoft controls every step between itself and the customer. This is the strategy Apple has adopted for consumer products: ownership from the silicon design to the final sale to the end user, and everything in between. The result is a best-in-class user experience and real market differentiation.
There is more to say about the technology of these new chips and the server infrastructure the company is building for them, but here I want to look at how these announcements will affect directly competing chip companies, and what Microsoft's custom silicon path implies more broadly.
Pressure on AI chip rivals
The first name that comes up when weighing the industry impact of the Azure Maia 100 is Nvidia. No other company has benefited more from the expansion of AI and the computing power it demands. Nvidia's data-center group generated more than $10 billion of revenue in the second quarter, including sales of its AI chips such as Hopper and Grace, a surge of 141% from the prior quarter. Nvidia leads in market share for GPUs used in AI training and inference, in both the enterprise and the cloud. And Microsoft has been one of, if not the, largest purchasers of Nvidia hardware.
So while Nvidia may see a short-term dip in GPU orders from Microsoft, there is more than enough demand from the rest of the market to make up for it. Nvidia has been selling its chips months in advance, with numerous reports indicating that 2024 supply was already sold out. Any chip that frees up will quickly be snapped up by others.
Next in the race for GPU and AI chip supremacy is AMD. The company holds the second position, and Microsoft has officially announced the adoption of AMD's MI300X chips for some new Azure instances. AMD CEO Lisa Su is optimistic that revenue from this chip will reach $1 billion quickly, and Microsoft may well account for the majority of it.
Microsoft and AMD have a very strong working relationship. The two companies have collaborated on custom silicon for previous generations of Xbox gaming consoles and Surface PCs, and Azure is a major consumer of AMD's EPYC data-center CPUs. That partnership likely helped lead to the MI300X adoption and could help AMD secure a broader presence in Microsoft's AI strategy.
Meanwhile, Intel INTC is struggling to regain relevance in the AI hardware race. Its efforts to build a GPU to compete with Nvidia and AMD have met with only modest success in the data center, in part because Intel entered the data-center GPU fray late.
This will be an uphill battle for Intel. With Microsoft, Alphabet's Google GOOGL and Amazon.com AMZN all expanding their custom silicon efforts, it is hard to see the largest infrastructure players turning to a third or fourth GPU source like Intel.
Still, the market for AI hardware is expanding so significantly that even if Nvidia loses a few percentage points of market share in 2024 and 2025, with most of it going to AMD and to custom silicon such as the Azure Maia 100, I don't expect any decline in revenue for the AI powerhouse.
Ripple effect from Microsoft’s move
Microsoft's continued push down the custom silicon path, with both the Maia 100 AI accelerator and the Cobalt 100 Arm-based CPU, has other intriguing implications. First, as I wrote last week, Arm Holdings ARM will continue to show the market that the footprint of its architecture is expanding. Nvidia and now Microsoft have bet on their own custom Arm-based CPUs to run energy-efficient data centers and a diverse set of workloads.
The last point worth making about these new custom chips concerns production. Taiwan Semiconductor Manufacturing 2330 TSM produces essentially all of the top processors and GPUs from Nvidia, AMD, Qualcomm QCOM, Apple AAPL and others, and now Microsoft's custom silicon chips (alongside chips from Amazon and Google). The allocation of that limited capacity has been a principal factor in the AI race: the company that secures the most silicon wafers has the chips to sell into the market.
If Microsoft finds it hard to compete for capacity against orders from Nvidia and Apple, will it be able to scale? Or is this an opening for Intel's foundry-services segment to truly profit from this market, even if Intel's own AI product portfolio faces challenges? There appears to be ample opportunity if Intel can ramp its foundry operation quickly enough.
Ryan Shrout is founder and principal analyst at Shrout Research. Follow him on X @ryanshrout. Shrout has provided consulting services for AMD, Qualcomm, Intel, Arm Holdings, Micron Technology, Nvidia and others. Shrout owns shares of Intel.
Also read: Nvidia pushes to stay ahead of Intel, AMD in high-stakes, high-performance computing race
More: These tech stocks performed well this earnings season despite the AI hype slowing down — and the winner may surprise you