Nvidia drives the AI economy. It just hit a new speed limit

Every frenzy has its moment — a point where record-breaking stops feeling record-breaking and jaw-dropping starts to sound like last quarter’s news. On Wednesday, the AI frenzy had one of those moments. 

Nvidia posted a huge quarter — $46.7 billion in revenue, $26.4 billion in profit, gross margins back above 72%, and guidance for an even fatter $54 billion in the current quarter — and Wall Street basically said: OK, and? Shares slipped about 1.5% through Thursday morning, as if the market had already grown a little bit bored with history.

The dissonance says everything. The AI economy is still expanding at a breakneck pace, but it’s colliding with the law of diminishing returns: Each additional dollar of capex buys growth but less thrill and wonder. The spectacle is giving way to the grind — a phase where power grids, licensing regimes, and productivity metrics matter as much as GPUs. Nvidia’s print wasn’t just a corporate milestone. It was a mirror for an industry learning that revolutions can also plateau.

Nvidia’s earnings report is just about the cleanest mirror you could hold up to the industry right now. It shows breathtaking demand, yes, and it also shows that speed limits are creeping in. Data-center revenue — the heart of the company’s AI story — hit $41.1 billion, up 56% from a year ago, massive by any measure, yet treated as routine. The company lifted its outlook to $54 billion and pointedly left China out. The market’s response wasn’t disappointment in the business; it was fatigue with the physics, the politics, and the tab.

Morgan Stanley analyst Joe Moore called the quarter “a clean beat and raise” in a Thursday note, noting that guidance of $54 billion excluding China topped his $52.5 billion estimate. But his conclusion cut to the bone: “Sentiment has largely caught up to the growth potential.” Nvidia cleared the bar, but the bar is now moving faster than the numbers. And at its altitude — trading at roughly high-20s to high-30s times forward earnings, depending on the estimate — even record-shattering results can look more like table stakes than a windfall.

The law of less and more (and Moore)

Chief financial officer Colette Kress framed the company’s scale in industrial terms: Production is now running at about 1,000 Blackwell racks a week, or nearly 864,000 GPUs a quarter once they’re fully flowing. She dubbed them “AI factories,” each designed to crank out tokens per watt the way a steel mill cranks out tons per hour. The efficiency pitch is seductive. Kress told investors that a $3 million outlay on Nvidia hardware can yield $30 million in token revenue — a 10x return.
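Kress’s figure of nearly 864,000 GPUs a quarter pencils out if each Blackwell rack is a GB200 NVL72 system (72 GPUs per rack) and a quarter spans roughly 12 production weeks — a quick sanity check, with the per-rack count and week count treated as assumptions rather than company-stated numbers:

```python
# Back-of-the-envelope check of the production and ROI figures above.
# Assumptions (not stated in the article): each Blackwell rack is a
# GB200 NVL72 with 72 GPUs, and a quarter covers ~12 production weeks.
GPUS_PER_RACK = 72
RACKS_PER_WEEK = 1_000
WEEKS_PER_QUARTER = 12

gpus_per_quarter = RACKS_PER_WEEK * WEEKS_PER_QUARTER * GPUS_PER_RACK
print(f"{gpus_per_quarter:,} GPUs per quarter")  # 864,000

# Kress's efficiency pitch: a $3 million hardware outlay yielding
# $30 million in token revenue.
capex = 3_000_000
token_revenue = 30_000_000
print(f"{token_revenue / capex:.0f}x return")  # 10x
```

Under those assumptions, the arithmetic lands exactly on the 864,000 figure Kress cited.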

The ROI optics are dazzling. But at scale, those same economics look like a treadmill: Falling inference prices invite more usage, which drives more spend, which lowers costs again. “A steady diet of beat and raise quarters should be enough for the stock to work at ~27x EPS,” Morgan Stanley’s Moore said. But he also admitted investors “don’t have a wall of worry to climb” anymore. In other words, the marginal thrill is harder to deliver, and the economics are looping in on themselves. It’s Jevons paradox in silicon: Every gain in efficiency drives more demand, which drives more spending.

Add the slowdown of Moore’s Law — chips no longer double their performance on schedule — and the treadmill comes into focus. Customers are sprinting harder just to stay in place. CEO Jensen Huang, when asked about Nvidia’s product cadence, said the company is on an annual product cycle to speed cost and performance gains so customers can better absorb soaring infrastructure and power costs. Each cycle delivers more compute but not necessarily more returns per dollar.

And diminishing returns creep in even amid industrial scale. Hyperscalers are throwing money at GPUs like never before. Microsoft is preparing to spend nearly $30 billion this quarter on data-center capex, Alphabet lifted its 2025 budget to $85 billion, Meta nudged its plan to as much as $72 billion. That ballooning spend is both Nvidia’s lifeblood and its risk factor. The company’s growth is chained directly to hyperscalers’ willingness to keep writing massive checks even as investors grow impatient for a payoff.

Yet even if investors are fatigued, Nvidia isn’t slowing. The company has turned scale into strategy, matching ballooning hyperscaler budgets with the only supply chain that can keep up. That kind of dominance doesn’t vanish in a quarter.

“We’re in every cloud for a good reason,” Huang told analysts on the earnings call. “You’ve heard me say before that in a lot of ways, the more you buy, the more you grow. And because our performance per dollar is so incredible, you also have extremely great margins. So the growth opportunity with Nvidia’s architecture and the gross margins opportunity with Nvidia’s architecture is absolutely the best. And so there are a lot of reasons why Nvidia has been chosen by every cloud and every startup and every computer company.”

The great wall of restrictions

For all of Nvidia’s triumph, the most glaring hole in its print was China. The company recorded zero H20 chip sales to Chinese customers in the second quarter. Kress acknowledged China’s share of data-center revenue had shrunk to “low single digits.” Guidance for the third quarter excludes China entirely. U.S. officials, meanwhile, have, according to Kress, “expressed an expectation that the [government] will receive 15% of the revenue generated from licensed H20 sales,” but that hasn’t been “codified” in regulation.

Huang told investors he’s “working with the administration” to secure licenses, but analysts are treating China as a ghost market or a phantom limb. Morgan Stanley calls it “impossible to forecast.” Jefferies pegs the upside at $2–5 billion in a single quarter if approvals come through. Wedbush went further, telling clients that a reopened China could propel Nvidia to a $5 trillion market cap by early 2026. For now, it’s all phantom math — a $50 billion prize that inflates models without hitting income statements.

Beyond geopolitics, physics is the other governor. After years of flat consumption, U.S. electricity demand is surging toward record highs in 2025 and 2026, with AI data centers a primary culprit. PJM, the country’s largest grid operator, says future capacity prices have jumped tenfold. Globally, the International Energy Agency expects data-center power use to more than double by 2030, equaling Japan’s entire national consumption. 

But Nvidia has built its pitch around efficiency: “Not only are we the most energy-efficient, our perf per watt is the best of any computing platform,” Huang said on the call. “And in a world of power-limited data centers, perf per watt drives directly to revenues.”

Packaging is another bottleneck. Advanced CoWoS capacity at TSMC is being doubled this year, but demand still outstrips supply. William Blair noted that Nvidia’s Blackwell Ultra already delivered over $10 billion in second-quarter revenue — faster than expected — precisely because it cornered packaging slots early. Memory isn’t faring any better. SK Hynix and Micron have said their high-bandwidth memory (HBM) supply is sold out well into 2025. Without HBM, racks of GPUs are stranded assets.

Meanwhile, networking — once an afterthought — has become Nvidia’s unlikely breakout star. Revenue from interconnects surged 46% last quarter to $7.3 billion. On the earnings call, Huang waxed poetic about Nvidia’s three-layer approach: “scale up” with NVLink, “scale out” with InfiniBand, “scale across” with Spectrum-XGS. Networking is now the glue that turns racks into supercomputers and clusters into “AI factories.” It’s also the latest reminder that bottlenecks don’t vanish — they just migrate.

Nvidia’s pivot is to sell the system, not just the chips. Spectrum-X Ethernet, which Kress said has reached “an annualized revenue exceeding $10 billion,” is the connective tissue. Sovereign AI has doubled year over year, now on track for over $20 billion in 2025. RTX Pro servers have nearly 90 early adopters, from Disney to Hyundai to Eli Lilly. Nvidia isn’t just selling GPUs; it’s selling the factory.

The AI boom’s second act

If Nvidia’s quarter proved anything, it’s that the AI boom hasn’t cooled, but it has matured. The first act was about audacity: hyperscalers panic-buying GPUs, model launches every other week, valuations ripping higher. The second act is about logistics: power contracts, packaging slots, sovereign customers, and early signs of productivity.

Wall Street still loves Nvidia’s story — and for good reason. The company remains the crown jewel of the chip trade. Citi, JPMorgan, Oppenheimer, BNP Paribas, and Morgan Stanley all raised their price targets after the earnings report, clustering around $210–$225. Morgan Stanley’s Moore still calls Nvidia his “most preferred name in large cap AI.” William Blair highlighted traction beyond hyperscalers. Jefferies named Nvidia its “Top Pick,” stressing that demand remains “rock solid.” And Citi analysts argued Nvidia’s dominance in data-center AI “is not only intact but expanding,” underscoring why most of Wall Street still leans bullish on the company.

So why does history-making growth feel so… ordinary?

The irony of diminishing returns is that Nvidia is still hitting numbers that would make any other company blush. Gross margins are climbing. Product roadmaps are on time. Blackwell racks are shipping at an industrial pace. Networking is a profit center. The company just authorized another $60 billion in buybacks. This isn’t a collapse. It’s a normalization — the shift from a frenzy that shocked the market to a business that has to satisfy it. Growth is still staggering, but each additional record buys less awe.

Nvidia remains the metronome of the boom. It’s still the only company that can sell out entire generations of chips before they ship. It’s still the name that moves the Nasdaq. It’s still the company that analysts pile superlatives on. It’s still the must-own name whose CEO and CFO can frame servers as “factories of intelligence” and be taken literally. But Wednesday’s report showed the beginnings of a new reality: Diminishing returns aren’t a hypothetical — they’re visible in the market’s reaction. The AI revolution hasn’t slowed; it’s entered the part of the curve where physics, politics, and investor patience set the tempo. And that’s when the thrill of “more” starts to sound a lot like “enough.”
