Nvidia Posts 75.2% Margin as $650 Billion AI Capex Looms
Khac Phu Nguyen
Fri, February 27, 2026 at 3:32 AM GMT+9
This article first appeared on GuruFocus.
Nvidia (NASDAQ:NVDA) entered this earnings release with one major question already answered by its own customers: demand into 2026 does not appear to be the issue. Hyperscale AI companies are collectively projecting roughly $650 billion in capital expenditures this year, up about 60% from 2025, and Nvidia is positioned to capture a substantial portion of that outlay. With that backdrop largely pre-telegraphed, the company needed a different lever to impress investors. It delivered on margins. Adjusted gross margin reached 75.2% in the November-January period, the highest level since the second half of 2024, and management guided to roughly similar levels in the current quarter. The debate now centers on how long that profitability profile can hold as the AI ecosystem matures.
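As a quick check of the capex figures cited above, the roughly $650 billion projection combined with the reported ~60% year-over-year increase implies a 2025 baseline of just over $400 billion (this is back-of-the-envelope arithmetic from the article's own numbers, not a figure disclosed by the companies):

```python
# Back-of-the-envelope check on the hyperscaler capex figures
# quoted in the article (assumption: the ~60% growth rate is
# measured against aggregate 2025 capital expenditures).
capex_2026 = 650e9          # projected 2026 hyperscaler AI capex, USD
growth = 0.60               # reported year-over-year increase

implied_2025 = capex_2026 / (1 + growth)
print(f"Implied 2025 capex: ${implied_2025 / 1e9:.1f}B")
```

Running this yields an implied 2025 base of about $406 billion, consistent with the scale of spending the article describes.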
Supply and input costs remain part of the equation. Chief Financial Officer Colette Kress said Nvidia has strategically secured inventory and capacity to meet demand beyond the next several quarters, though she acknowledged that supply tightness is expected to persist. Memory costs are rising, and leading component manufacturers have warned shortages could extend through 2027 and possibly longer, reinforcing that AI hardware demand continues to outpace the infrastructure needed to produce it. At the same time, competitive alternatives are becoming more visible. According to Bloomberg Intelligence, Google’s (NASDAQ:GOOG) Tensor Processing Units carry an average selling price of $8,000 to $10,000 per unit, compared with $23,000 or more for Nvidia’s H100 and $27,000 and above for its newer Blackwell system, a gap that could make diversification more appealing for some customers.
That competitive shift is beginning to show up in real contracts. Alphabet has highlighted growing internal and cloud usage of its TPUs, Amazon (NASDAQ:AMZN) has secured Anthropic as a customer for its own AI chips, and this week Meta Platforms (NASDAQ:META) and Advanced Micro Devices (NASDAQ:AMD) announced an agreement worth tens of billions of dollars to supply processors for data centers, following a similar arrangement OpenAI negotiated last October that also included stock. Chief Executive Jensen Huang responded by underscoring Nvidia's broader functionality, arguing that its GPUs can handle a wider range of AI workloads beyond just training or inference, while also pointing to power-efficiency gains when energy availability is constrained. With slightly more than half of Nvidia's $62.3 billion in data center revenue tied to AI hyperscalers, sustaining those generational technology advances could be central to preserving premium pricing, particularly as the industry waits to see whether the revenue return from agentic AI ultimately scales in line with the investment.