The demand for AI chips is far from saturated: Insights into Nvidia's real situation from Oracle CEO's "plea"

Oracle CEO Larry Ellison recently revealed a detail at an analyst conference: he had dinner with Elon Musk and NVIDIA CEO Jensen Huang at the Nobu restaurant in Palo Alto, during which he and Musk "begged" Huang to sell them more GPUs.

This sounds like a joke, but it reflects a harsh reality: the AI chip demand of even the world's top tech companies is far from being met.

Data Speaks: How Big is the Demand Gap?

Oracle currently operates 85 data centers and has another 77 under construction, but Ellison's ambition is to eventually operate 2,000. Next year, Oracle plans to deploy a cluster of 131,072 GPUs (the largest existing cluster has about 32,000), all using NVIDIA's latest Blackwell chips.

What does this mean? NVIDIA's latest chips deliver AI inference roughly 30 times faster than the existing H100.

The numbers are even more striking:

  • Oracle signed 42 new GPU capacity contracts in its fiscal Q1, worth $3 billion.
  • Remaining Performance Obligations (RPO) hit a record $99 billion, up 53% year over year.
  • Cloud infrastructure revenue reached $2.2 billion, up 46% year over year.

But Oracle says it simply cannot fill these orders — because of the chip shortage.

Not Just Oracle Is Starving for Chips

Tesla is in the same predicament. Musk plans to bring a 50,000-GPU cluster for autonomous-driving AI training online by the end of the year, at an investment of $10 billion, and will need to keep expanding after that.

Microsoft's fiscal year capital expenditure reached $55.7 billion, primarily for AI infrastructure, and plans to continue increasing it. Amazon's capital expenditure has surpassed the $60 billion mark.

These companies are not just "testing the waters"; they are pouring money in to lock up production capacity.

The "Opportunity" in NVIDIA's 14.5% Stock Drop

The market has recently worried that the AI investment craze is cooling, with NVIDIA's stock down 14.5% from its all-time high. But judging by how frantically these companies are scrambling for GPUs, that concern looks premature.

NVIDIA's data center revenue for the most recent fiscal quarter was $26.3 billion, up 154% year over year. Although growth has slowed from earlier quarters (off a high base), customers are still accelerating their spending.

From a valuation perspective:

  • The current P/E ratio is 52.7x (indeed expensive next to the Nasdaq 100's 30.9x).
  • However, based on expected earnings per share of $4.02 for fiscal 2026, the forward P/E is only 28.8x.
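The two multiples above follow from the same arithmetic. As a minimal sketch — the share price here is not stated in the article but is back-derived from the forward figures it quotes — the relationship looks like this:

```python
def pe_ratio(price: float, eps: float) -> float:
    """Price-to-earnings ratio: share price divided by earnings per share."""
    return price / eps

# Implied share price, back-derived from the article's forward multiple
# (28.8x on $4.02 expected EPS) — an assumption, not a quoted figure.
price = 4.02 * 28.8            # ≈ $115.78

forward_pe = pe_ratio(price, 4.02)     # → 28.8
trailing_eps = price / 52.7            # implied trailing EPS ≈ $2.20

print(f"forward P/E: {forward_pe:.1f}, implied trailing EPS: ${trailing_eps:.2f}")
```

The gap between 52.7x trailing and 28.8x forward is simply the market pricing in roughly a doubling of earnings per share over the period.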

In other words, for investors willing to hold for at least 18 months, the current price may be a reasonable entry point.

When Will It Cool Down?

There will certainly be a ceiling. The current scale of AI spending is hard to sustain long term, and competition in the GPU market is heating up. But on the available data, that day is still far off: Oracle and Tesla are still scrambling to secure chips, while Microsoft and Amazon keep ramping up their investments.

Ellison and his peers are still begging NVIDIA to sell them more chips. That says more about real market demand than any bearish commentary.
