Microsoft’s $1 Billion Investment in AI Startup Is Good News for Nvidia
Even for a tech giant that has been making big AI-related investments for years, Microsoft’s (MSFT) deal to pour $1 billion into an AI startup is a little stunning.
On Monday morning, Microsoft announced it’s investing that large sum in OpenAI, an AI/deep learning research firm that’s backed by Elon Musk, Peter Thiel and several other high-profile investors, at an undisclosed valuation. As part of the deal, OpenAI will port its services to run on Microsoft’s Azure cloud platform, and also leverage Azure to support R&D work related to bringing artificial general intelligence (AGI) — the ability of machines to fully reason and learn tasks the way that human beings are able to — a little closer to reality.
The deal also calls for Microsoft to become OpenAI’s “preferred partner for commercializing new AI technologies,” and for the companies to “jointly build new Azure AI supercomputing technologies.” The companies add that a computing platform of “unprecedented scale” will be built on Azure to train and run advanced AI models.
Notably, OpenAI, which started out as a non-profit but eventually became a capped-profit company, suggests much of Microsoft’s investment will go towards AI computing resources hosted on Azure. And it’s quite likely that these resources will include a lot of Nvidia (NVDA) server GPUs.
Though competition has started to pick up a little, Nvidia, aided by a vast software ecosystem, remains the dominant player in the market for accelerators used to handle the demanding task of training AI models to do things such as translate text, identify objects within photos and videos, understand voice commands and detect signs of disease in CT and MRI scans. The company’s flagship Tesla V100 server GPU has been widely deployed for AI training by cloud giants, and has also been used by OpenAI to train neural networks that can perform tasks such as creating music and competing against professional gamers.
Meanwhile, less powerful Nvidia server GPUs are increasingly used for inference — the running of trained AI models against real-world data and content. While a lot of inference work is still handled by Intel (INTC) server CPUs, a steadily growing portion is handled by accelerators such as GPUs, programmable chips (FPGAs) and custom-designed chips (ASICs). Microsoft, for its part, has been using both Nvidia GPUs, and FPGAs from Intel and Xilinx (XLNX), for inference.
In addition to benefiting Nvidia’s top line, the Microsoft/OpenAI deal highlights — at a time when Nvidia’s Datacenter segment sales are pressured by a cloud capital spending pause — how much long-term demand there still is for more AI training computing power, as datasets grow, models become more complex and new AI applications are pursued.
And certainly, efforts to bring AGI to fruition — that is, to overcome the many limitations today’s deep learning systems still have when it comes to performing human-like reasoning and task-learning — won’t do anything to reduce this demand. While discussing Microsoft’s investment, OpenAI says it wants AGI to “work with people to solve currently intractable multi-disciplinary problems, including global challenges such as climate change, affordable and high-quality healthcare, and personalized education.” If other cloud giants respond to the OpenAI deal by stepping up their own AGI R&D work, that benefits Nvidia as well.
From Microsoft’s standpoint, making OpenAI’s work available to Azure clients, and increasing the AI-related computing power Azure provides, strengthens its hand in a hotly competitive field. Microsoft, Amazon.com (AMZN) and Alphabet/Google (GOOGL) have all been investing heavily both in rolling out cloud services that let clients quickly train and deploy new AI models, and in launching programming interfaces (APIs) that let developers leverage the cloud giants’ own R&D work in areas such as computer vision, speech analysis and natural language processing. Google, which has made massive AI investments for its consumer services and in 2014 bought AI research startup DeepMind, has been seen as particularly strong in this area.
Microsoft’s investment also fits with its broader efforts to sell developers on adopting Azure — efforts that include spending $7.5 billion last year to buy popular code-hosting and sharing platform GitHub. The company clearly isn’t shy right now about opening its checkbook to help win over the types of developers who in the past may have been wary of it.
And when it comes to Microsoft’s latest such move, Nvidia, a company that knows a thing or two itself about going the extra mile to win over developers, is pretty unlikely to complain.