The AI ‘Equipment’ Play

By Chris Lillard

When Microsoft Corp. (MSFT) reported earnings this past week, everyone was looking for two things:

  1. How much cloud revenue was slowing down
  2. Updates on artificial intelligence (AI)

The company reported cloud revenue growth of 27%, down from the 31% reported last quarter, but still better than expectations of 26.5% growth.

With the “all clear” siren sounded on that news, the focus was then on the exciting AI space.

Since launching in January, Microsoft’s Azure OpenAI service, a cloud platform that hosts AI applications, has picked up 2,500 customers. Microsoft has also been integrating AI into its search engine, Bing, and says Bing now has over 100 million daily users and downloads.

That’s still a far cry from Google’s search engine and its more than 1 billion daily active users, but integrating AI into Bing appears to have drummed up far more interest.

CEO Satya Nadella said he will also share how Microsoft is “building the most powerful AI platform for developers” at the company’s Build conference near the end of May.

The Microsoft stock price shot up after earnings were released, climbing more than 7% in a day.

Before the earnings announcement, our tools and systems showed Microsoft was firing on all cylinders. Our Health Indicator rates MSFT as a healthy investment if you’re considering buying shares.

But Microsoft’s push into the competitive AI space also brings to light another opportunity to consider: Nvidia Corp. (NVDA).

For a chatbot to answer a question or draw a picture in a few seconds, it needs a powerful chip like Nvidia’s $10,000 A100 — and the companies behind these AI technologies need hundreds or thousands of chips.

Nvidia also sells the DGX A100, a system that combines eight A100 GPUs working together.

The DGX A100 has a suggested price of $200,000, and research firm New Street Research estimates that the ChatGPT model inside Bing could need eight GPUs to answer a question in less than a second.

For Bing to provide this service to all of its users, Microsoft would need more than 20,000 of these 8-GPU servers, which could cost the company $4 billion.
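That figure holds up as a rough back-of-the-envelope check, assuming the $200,000 suggested price for each DGX A100 system: 20,000 servers x $200,000 per server = $4 billion in hardware alone.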

The investment bank Citigroup projects that ChatGPT usage alone could bring Nvidia $3 billion to $11 billion in sales by the end of 2023.

With interest in AI as high as it is right now, it’s no wonder that proprietary data from our friends at LikeFolio shows that Purchase Intent mentions — how many people have mentioned buying or thinking about buying a product or service from Nvidia — are 10% higher year-over-year (YoY).

Taken together, these data points signal massive growth potential for Nvidia. The chip maker is expected to announce earnings on May 24, and we’ll see how the appetite for AI innovation has impacted the company’s top and bottom lines so far.