Nvidia CEO Jensen Huang on Wednesday dismissed concern about an end to a spending boom on artificial intelligence chips, projecting opportunities will expand into a multi-trillion-dollar market over the next five years.
Huang sought to reassure investors rattled by indications of slowing growth at the chipmaker at the centre of the investment frenzy. Nvidia earlier in the day forecast third-quarter revenue that met analyst estimates but fell short of the lofty expectations that have sent its share price up roughly one-third this year.
The founder and CEO's bullish outlook contrasts with recent signs of fatigue in AI-focused stocks and comments from industry leaders about overheated investor enthusiasm.
"A new industrial revolution has started. The AI race is on," Huang said. "We see $3 trillion to $4 trillion in AI infrastructure spend by the end of the decade."
Pushing up the chipmaker's shares are expectations of demand from Big Tech, from data centre owners known as hyperscalers, and from China.
"The mega caps are the ones propelling a lot of the capex that Nvidia is benefiting from. But obviously Nvidia still is growing, is able to sell," said Matt Orton, head of advisory solutions at Raymond James Investment Management.
"If anything, this just highlights that there's a lot of durability to this (AI) trade... The businesses of these hyperscalers can continue to accelerate, and you're not seeing any sort of sign of a slowdown being reflected in the results of Nvidia."
While Nvidia shares have outpaced a roughly 10% gain in the broader market, AI-facing stocks have shown signs of fatigue. OpenAI CEO Sam Altman set off alarm bells this month when he said investors may be "overexcited" about AI.
On Wednesday, Huang sounded unperturbed.
"The more you buy, the more you grow," Huang said, arguing that Nvidia's technological advances allow customers to process increasing amounts of data while using less energy. "The buzz is: everything sold out."
Case in point: A customer outside China bought $650 million worth of Nvidia's H20 reduced-capability chip aimed at the Chinese market in the latest quarter, the chipmaker said.
Huang based his forecast in part on the $600 billion he expects for data centre capital spending this year from major customers such as Microsoft and Amazon.
For a data centre costing as much as $60 billion, Nvidia can capture about $35 billion, Huang said.
Huang's remarks contrast with a tepid third-quarter sales forecast of about $54 billion, only slightly ahead of the $53.14 billion average analyst estimate compiled by LSEG.
Nvidia and Huang, however, see little reason for AI chip profit growth to slow, as the company's fiscal second-quarter net income surpassed the fiscal third-quarter profit of Big Tech peer Apple.
The company's high-end Blackwell chips are largely spoken for based on 2026 forecasts from its biggest customers. Its earlier-generation Hopper processors are being snapped up too.
"When you have something that is new, and it's growing as fast as it is, and with all of the huge capex announcements from the hyperscalers, it's evidence that we're in the early stages" of the AI boom, said Globalt Investments portfolio manager Thomas Martin.