Micron Just Sold Out Its Entire 2026 HBM Capacity

Micron's HBM4 enters volume production ahead of schedule, 2026 capacity is 100% sold out, and the stock hit a record $410. Here's what it means for the AI memory race.

2/15/2026

$MU jumped 9% on February 11 and closed at a record $410. The reason: Micron just confirmed HBM4 is in volume production — a full quarter early — and every byte of its 2026 capacity is already spoken for.

That's not "strong demand." That's sold out. Done. Come back in 2027.

What happened

At the Wolfe Research Conference on February 11, CFO Mark Murphy systematically demolished every bear thesis. Three data points that moved the stock:

  1. HBM4 is in volume production. Not sampling, not pilot lines. Volume. A quarter ahead of the original timeline.
  2. 2026 HBM capacity is 100% sold out. Multi-year agreements, locked-in pricing. This isn't cyclical memory anymore — it's recurring revenue with fat margins.
  3. Pin speeds hit 11.7 Gbps. The benchmark was 11 Gbps. Micron beat it, which kills the rumor that they couldn't meet NVIDIA's specs for Vera Rubin.

That last point wasn't just an incremental win — it was a direct rebuttal to months of skepticism. Bears claimed Samsung and SK Hynix had HBM locked up, and Micron was a distant third playing catch-up. Murphy's numbers flipped the script: Micron leapfrogged intermediate HBM3E steps and went straight to HBM4 with specs that exceed NVIDIA's requirements.

The numbers

Metric              Detail
------              ------
Stock move          +9% on Feb 11, record $410, trading $409-$413 since
HBM4 pin speed      11.7 Gbps (vs. 11 Gbps target)
2026 HBM capacity   100% sold out
Process node        1-beta (1β) DRAM
Interface width     2048-bit (2x previous gen)
Trading volume      ~3x daily average
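As a rough sanity check, the specs above imply peak per-stack bandwidth: interface width times per-pin speed. This is a back-of-envelope sketch, not a Micron-published figure; the ~3 TB/s result is derived from the two numbers in the table.

```python
# Back-of-envelope HBM4 per-stack bandwidth from the table above.
# Assumption: peak bandwidth = interface width (bits) * per-pin speed (Gbps),
# converted from gigabits to gigabytes by dividing by 8.
interface_bits = 2048      # HBM4 interface width (2x the 1024-bit HBM3E interface)
pin_speed_gbps = 11.7      # Micron's reported per-pin speed

bandwidth_gbps = interface_bits * pin_speed_gbps   # gigabits per second
bandwidth_gBps = bandwidth_gbps / 8                # gigabytes per second

print(f"{bandwidth_gbps:.0f} Gb/s ≈ {bandwidth_gBps / 1000:.2f} TB/s per stack")
# → 23962 Gb/s ≈ 3.00 TB/s per stack
```

That ~3 TB/s per stack, versus roughly 1.2 TB/s for 9.6 Gbps HBM3E, is why the 2048-bit interface and the 11.7 Gbps pin speed matter together: the bandwidth gain multiplies.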

Why it matters

The AI infrastructure buildout runs on three things: GPUs (NVIDIA), networking (Broadcom, Cisco), and memory (Micron, SK Hynix, Samsung). GPUs get the headlines, but without HBM, those GPUs are paperweights. Every NVIDIA Blackwell Ultra and Vera Rubin chip needs stacks of high-bandwidth memory to function.

Micron going from "maybe they can compete" to "sold out for the year" changes the supply picture. NVIDIA and AMD now have a reliable third HBM supplier, which reduces their dependency on Korean fabs and de-risks their own production targets.

For Micron specifically, the shift is structural. Memory has always been a brutal cyclical business — boom, bust, repeat. But HBM operates differently. Long-term contracts, locked pricing, and capacity that sells out before it's built. That's closer to TSMC's foundry model than the old DRAM spot market. Analysts at Morgan Stanley revised price targets upward the same day.

The bear case

Two things to watch. First, double-ordering. When capacity is this tight, customers sometimes book more than they need to secure supply. If AI spending slows and those orders get pulled, Micron's "sold out" narrative unwinds fast.

Second, HBM margins are great today because demand outstrips supply. Samsung and SK Hynix are both ramping their own HBM4 production. By late 2026 or 2027, the supply picture could look very different.

Bottom line

Micron went from the weakest player in the HBM race to the one shipping a quarter early with specs that exceed targets. At $410, the stock prices in a lot of good news. But "sold out through 2026" is the kind of visibility most semiconductor companies would kill for. If AI capex stays on the current trajectory — and every hyperscaler says it will — Micron's position only gets stronger from here.

Stocks mentioned

MU