Nvidia, which makes microchips that power most artificial intelligence applications, began an extraordinary run a year ago.
Buoyed by an explosion of interest in AI, the Silicon Valley company said last May that it expected its chip sales to soar. They did, and the fervor didn't stop: Nvidia raised its revenue projections every few months. Its shares soared too, lifting the company to a market capitalization of more than $2 trillion and making it more valuable than Alphabet, Google's parent company.
On Wednesday, Nvidia again reported surging revenue and profit, results that underscored how it remains a dominant winner of the AI boom, even as it grapples with outsize expectations and growing competition.
Revenue was $26 billion for the three months that ended in April, beating the $24 billion estimate the company gave in February, and sales tripled from a year earlier for the third consecutive quarter. Net income rose sevenfold to $5.98 billion.
Nvidia also projected revenue of $28 billion for the current quarter, which ends in July, more than double what it was a year ago and higher than Wall Street estimates.
“We are fundamentally changing how computing works and what computers can do,” Jensen Huang, Nvidia's chief executive, said on a conference call with analysts. “The next industrial revolution has begun.”
Nvidia shares, which have risen more than 90 percent this year, rose in after-hours trading after the results were released. The company also announced a 10-for-1 stock split.
Nvidia, which originally sold chips for rendering images in video games, has benefited from an early and expensive bet on adapting its graphics processing units, or GPUs, to take on other computing tasks. When AI researchers began using such chips more than a decade ago to speed up tasks like recognizing objects in photographs, Mr. Huang jumped at the opportunity. He augmented Nvidia's chips for artificial intelligence tasks and developed software to aid advances in the field.
The company's flagship processor, the H100, has enjoyed feverish demand to power AI chatbots like OpenAI's ChatGPT. While most standard high-end processors cost a few thousand dollars, H100s have sold for between $15,000 and $40,000 each, depending on volume and other factors, analysts said.
Colette Kress, Nvidia's chief financial officer, said Wednesday that she had worked in recent months with more than 100 customers that were building new data centers, which Mr. Huang calls “AI factories,” ranging in size from hundreds of GPUs to tens of thousands, with some reaching 100,000. Tesla, for example, is using 35,000 H100 chips to help train models for autonomous driving, she said.
Nvidia will soon begin shipping a powerful successor to the H100, called Blackwell, which it announced in March. Demand for the new chips already appears to be strong, raising the possibility that some customers will wait for the faster models instead of purchasing the H100. But there were few signs of such a pause in Nvidia's latest results.
Ms. Kress said demand for Blackwell was well above supply, and “we expect demand to outstrip supply well into next year.” Mr. Huang added that the new chips should be running in data centers by the end of this year and that “we will see a lot of revenue from Blackwell this year.”
The comments may ease fears of a slowdown in Nvidia's momentum.
“Long-term investors' concerns about an 'air bubble' for GPU demand appear to have disappeared,” Lucas Keh, an analyst at research firm Third Bridge, said in an email.
Wall Street analysts are also looking for signs that some richly funded rivals could take a significant chunk of Nvidia's business. Microsoft, Meta, Google and Amazon have all developed their own chips that can be tailored for AI work, although they have also said they are boosting purchases of Nvidia chips.
Traditional rivals such as Advanced Micro Devices and Intel have also made optimistic predictions about their AI chips. AMD has said it expects $4 billion in sales this year of its new AI processor, the MI300.
Mr. Huang frequently points to what he says is a sustainable advantage: Nvidia's GPUs are the only ones offered by all the major cloud services, such as Amazon Web Services and Microsoft Azure, so customers don't have to worry about being locked into any one of those services because of its proprietary chip technology.
Nvidia also remains popular among computer manufacturers who have long used its chips in their systems. One is Dell Technologies, which on Monday hosted an event in Las Vegas in which Mr. Huang participated.
Michael Dell, Dell's founder and chief executive, said his company would offer new data center systems that would pack 72 of the new Blackwell chips into a computer rack, a standard structure that is a little taller than a refrigerator.
“Don't seduce me with words like that,” Mr. Huang joked. “That makes me very excited.”