“Entered New Era”: SK Hynix To Build $13 Billion Memory Plant As Nvidia CEO Says AI Demand Soaring
One week after Nvidia CEO Jensen Huang told the audience at CES in Las Vegas that AI data centers are creating a “market that never existed” for memory, SK Hynix, the global leader in high-bandwidth memory (HBM), announced that construction of a new $13 billion advanced memory manufacturing plant will begin this spring, with first production targeted for the second half of 2027, as the company races to keep up with surging AI-driven demand.
As Nvidia’s main HBM supplier, SK Hynix sits at the center of a memory supply chain that has become a bottleneck for AI data center growth. With HBM prices soaring, we have been documenting the key developments across the space:
- “Market That Never Existed”: Nvidia CEO Sparks Frenzy In Memory Stocks
- Soaring Memory Costs Sink Nintendo Shares; Goldman Says Selloff Is Buy-The-Dip Opportunity
- UBS Says Soaring Memory Chip Prices To “Turbo-Charge” Samsung Earnings
SK Hynix’s HBM is the stacked DRAM Nvidia uses on the H100, H200, Blackwell, and every AI accelerator shipping through 2030. The company accounts for easily half or more of the world’s HBM supply, ahead of both Samsung and Micron.
Last week, Huang told the audience at CES, “For storage, that is a completely unserved market today. This is a market that never existed, and this market will likely be the largest storage market in the world, basically holding the working memory of the world’s AIs.”
SK Hynix forecasts that the HBM market will grow at an average annual rate of 33% from 2025 to 2030.
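To put that forecast in perspective: compounding a 33% average annual rate over the five years from 2025 to 2030 implies the HBM market would roughly quadruple. A minimal back-of-the-envelope sketch:

```python
# Compound SK Hynix's forecast 33% average annual HBM market growth
# over the five years from 2025 to 2030.
rate = 0.33
years = 5  # 2025 -> 2030

multiple = (1 + rate) ** years
print(f"Implied market size multiple: {multiple:.2f}x")  # ~4.16x
```

In other words, a market growing at a steady 33% per year is more than four times its starting size five years later.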
“The importance of proactively responding to rising HBM demand is becoming increasingly critical,” SK Hynix wrote in a statement.
Chey Tae-won, chairman of SK Hynix parent SK Group, warned of tight supplies late last year. “We have entered an era in which supply is facing a bottleneck. We are receiving memory chip supply requests from many companies, and we are thinking hard about how to address all demands.”







