From 8% to 30%! Memory spending is skyrocketing
Semiconductor Industry Watch
SemiAnalysis projects that by 2026, memory spending will account for about 30% of total capital expenditure at hyperscale data centers, up from roughly 8% in 2023, and expects the ratio to climb further in 2027. In just four years, with DRAM prices surging to previously unimaginable levels and HBM supply severely constrained, memory's share of spending will nearly quadruple.
Evidently, this is driven by demand for artificial intelligence.
SemiAnalysis projects that DRAM prices will more than double in 2026, with average selling prices rising by double-digit percentages again in 2027. LPDDR5 contract prices have already more than tripled since the first quarter of 2025, and the firm estimates that open-market spot prices may exceed $10/GB this quarter.
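As a quick sanity check on the arithmetic behind these projections, the cited multiples can be reproduced in a few lines. The figures are those reported above; the variable names and the normalized baseline are illustrative, not from SemiAnalysis.

```python
# Rough sanity check of the projections cited in the article.
memory_share_2023 = 0.08   # ~8% of hyperscaler capex in 2023
memory_share_2026 = 0.30   # projected ~30% in 2026

growth = memory_share_2026 / memory_share_2023
print(f"Memory share multiple, 2023 -> 2026: {growth:.2f}x")  # 3.75x, i.e. "nearly fourfold"

# LPDDR5 contract prices: "more than tripled" since Q1 2025
# (normalized to an illustrative Q1 2025 baseline of 1.0).
q1_2025_price = 1.0
current_multiple = 3.0
print(f"LPDDR5 contract price multiple since Q1 2025: >{current_multiple / q1_2025_price:.0f}x")
```

A 3.75x increase in share is consistent with the article's "nearly quadruple" characterization.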
Based on SemiAnalysis's survey results, HBM, the vertically stacked memory that is key to AI accelerators, will still face a supply shortfall in 2027, and memory accounts for a large portion of the roughly $250 billion in new hyperscale data-center spending projected for this year.
This has already shown up in AI server pricing. SemiAnalysis notes that, hit hard by rising memory costs, B200 server prices are expected to rise by as much as 20% by the end of the year. That matches the broader industry trend: on recent earnings calls, manufacturers acknowledged sharp increases in component costs. Dell COO Jeff Clarke described the pace of cost increases as "unprecedented" on Dell's fiscal 2025 third-quarter earnings call last November.
Counterpoint Research has separately predicted that by the end of 2026 a DDR5 64GB RDIMM module could cost twice what it did in early 2025. AI servers built on Nvidia platforms that use LPDDR see the steepest price increases, because each system requires a substantial amount of memory.
SemiAnalysis points to an interesting phenomenon: Nvidia obtains so-called "VVP" (Very Very Preferred) DRAM pricing from suppliers, "far below the prices paid by hyperscale data centers and the entire market." SemiAnalysis believes this compresses Nvidia's own server costs and pulls down the market-wide pricing benchmark, masking the severe shortages other buyers actually face.
AMD's situation is the opposite. Its AI accelerator SKUs typically carry higher unit prices, so it cannot obtain the same discounted supplier pricing, and it is "more easily affected by rising memory costs when its AI accelerator scale is far smaller than Nvidia's." In other words, Nvidia's purchasing scale in HBM and conventional DRAM gives it an advantage that smaller-volume buyers cannot match.
SemiAnalysis concludes that although major cloud operators have partially reflected rising memory prices in their 2026 capital expenditure guidance, Wall Street's expectations do not yet price in a 2027 repricing. Samsung, SK hynix, and Micron have all shifted capacity toward HBM and high-margin enterprise DRAM, constraining supply of conventional DDR5 and LPDDR5. In addition, Micron's $9.6 billion HBM plant in Hiroshima and SK hynix's expansion projects in Icheon and Cheongju will not reach meaningful production until 2027 or 2028 at the earliest.
Responsible editor: Lingchen