Just caught Jensen Huang dropping some pretty wild numbers on the latest earnings call, and I think people are underestimating what this means for the next few years.
So Nvidia's about to ship the Vera Rubin platform starting in the second half of this year, and the specs are honestly insane. We're talking about training AI models with 75% fewer GPUs than Blackwell, plus slashing inference token costs by 90%. For context, tokens are the chunks of text an AI model reads and generates, and every token produced costs money to serve. When you cut costs that dramatically, you're looking at a massive unlock for AI companies to scale both usage and margins.
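To make that 90% figure concrete, here's a back-of-the-envelope sketch of what it does to token volume at a fixed budget. The baseline price and budget are hypothetical placeholders; only the 90% cost cut comes from the call.

```python
# Hypothetical baseline: $0.010 per 1k inference tokens, $1,000/month budget.
# Only the 90% cost reduction is from the earnings call.
old_cost_per_1k = 0.010
new_cost_per_1k = old_cost_per_1k * (1 - 0.90)   # 90% cheaper

budget = 1_000.0
old_tokens = budget / old_cost_per_1k * 1_000
new_tokens = budget / new_cost_per_1k * 1_000

print(f"Tokens per ${budget:.0f}: {old_tokens:,.0f} -> {new_tokens:,.0f}")
# The same budget buys 10x the tokens
```

Same spend, ten times the output, which is why cheaper tokens tend to expand usage rather than shrink revenue.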
But here's the really interesting part. During the earnings call, Jensen Huang made this comment that stuck with me. He said the world spent around $400 billion per year on classical computing infrastructure historically, but the capacity required for AI workloads is a thousand times higher. That's not a typo. A thousand times. He's also been saying AI data center infrastructure spending could hit $4 trillion annually by 2030.
Now, that sounds ambitious, but think about what's actually happening. Nvidia just posted $215.9 billion in revenue for fiscal 2026, up 65% year-over-year. Data center revenue alone was $193.7 billion, growing 68%. They're forecasting Q1 FY2027 revenue at $78 billion, which would be a 77% jump. And they're basically saying most of that growth is coming from data centers.
What's wild is that Nvidia is mostly competing with itself right now, not its competitors. Demand is still outpacing supply. The Vera Rubin platform is going to be another step function improvement, and Jensen Huang seems pretty convinced the infrastructure spending isn't slowing down anytime soon.
From a valuation angle, the stock's trading at a P/E of 36.1 right now, which is actually 41% below its 10-year average of 61.6. Wall Street's expecting earnings per share to grow to $8.23 in fiscal 2027, which would put the forward P/E at just 21.5. For comparison, the S&P 500 is trading at a trailing P/E of 24.7. So even if the stock doesn't move much higher over the next year, Nvidia could actually be cheaper than the broad market.
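The multiples above hang together, and a quick sanity check shows it. All inputs come straight from the text; the implied share price (~$177) is my derivation from forward P/E times forward EPS, not a quoted figure.

```python
# Inputs quoted in the post
trailing_pe = 36.1   # current trailing P/E
avg_10yr_pe = 61.6   # 10-year average P/E
fy2027_eps = 8.23    # Wall Street FY2027 EPS estimate
forward_pe = 21.5    # forward P/E quoted in the post

# Discount of today's multiple to the 10-year average
discount = 1 - trailing_pe / avg_10yr_pe
print(f"Discount to 10-yr average: {discount:.0%}")   # ~41%, matching the post

# Share price implied by the forward numbers (derived, not quoted)
implied_price = forward_pe * fy2027_eps
print(f"Implied share price: ~${implied_price:.0f}")
```

The 41% discount checks out exactly, which is the core of the "cheaper than the broad market" argument.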
If Wall Street's earnings estimate hits, the stock would need to roughly double just to trade at its historical average valuation. That's before factoring in any upside from accelerating AI adoption as inference costs keep dropping. Definitely something worth watching as we head into the back half of the year.