Microsoft technology chief says supply of Nvidia AI chips is improving

Nilay Patel, left, editor in chief of The Verge, and Kevin Scott, Microsoft’s chief technology officer, speak at Vox Media’s Code Conference at The Ritz-Carlton Laguna Niguel in Dana Point, California, on Sept. 27, 2023.

Jerod Harris | Vox Media | Getty Images

Microsoft technology chief Kevin Scott said on Wednesday that the company is having an easier time getting access to Nvidia's chips that run artificial intelligence workloads than it did a few months ago.

Speaking on stage at the Code Conference in Dana Point, California, Scott said the market for Nvidia’s graphics processing units is opening up a little. The GPUs have been heavily in demand since Microsoft-backed OpenAI launched the ChatGPT chatbot late last year.

“Demand was far exceeding the supply of GPU capacity that the whole ecosystem could produce,” Scott told The Verge’s Nilay Patel. “That is resolving. It’s still tight, but it’s getting better every week, and we’ve got more good news ahead of us than bad on that front, which is great.”

Microsoft, like Google and other tech companies, has been quickly adding generative AI to its own products and selling the technology's capabilities to clients. Training and deploying the underlying AI models has relied mainly on Nvidia's GPUs, creating a supply shortage.

Nvidia said last month that it expects revenue growth this quarter of 170% from a year earlier. The company has such control of the AI chip market that its gross margin shot up from 44% to 70% in a year. Nvidia’s stock price is up 190% in 2023, far outpacing every other member of the S&P 500.

In an interview with Patel that was published in May, Scott said one of his responsibilities is controlling the GPU budget across Microsoft. He called it “a terrible job” that has been “miserable for five years now.”

“It’s easier now than when we talked last time,” Scott said Wednesday. At that time, generative AI technologies were still new and attracting broad attention from the public, he said.

The increased supply “makes my job of adjudicating these very gnarly conflicts less terrible,” he said.

Nvidia expects to increase supply each quarter through next year, finance chief Colette Kress told analysts on last month’s earnings call.

Traffic to ChatGPT has declined month over month for three consecutive months, Similarweb said in a blog post. Microsoft provides Azure cloud-computing services to OpenAI. Meanwhile, Microsoft plans to start selling access to its Microsoft 365 Copilot to large organizations with subscriptions to its productivity software in November.

Scott declined to address the accuracy of media reports regarding Microsoft’s development of a custom AI chip, but he did highlight the company’s in-house silicon work. Microsoft has previously worked with Qualcomm on an Arm-based chip for Surface PCs.

“I’m not confirming anything, but I will say that we’ve got a pretty substantial silicon investment that we’ve had for years,” Scott said. “And the thing that we will do is we’ll make sure that we’re making the best choices for how we build these systems, using whatever options we have available. And the best option that’s been available during the last handful of years has been Nvidia.”

WATCH: Amazon’s A.I. investment is a page out of the OpenAI-Microsoft playbook: Deepwater’s Doug Clinton
