Is the Memory Market Truly Facing a Shortage?
2025-11-12


How AI Is Reshaping the Memory Supply Landscape



From Boom–Bust Cycles to AI-Driven Demand

The memory chip market — once famous for its volatility — is entering what many analysts call a “supercycle.”


AI powerhouses like NVIDIA and OpenAI have turned memory from a commodity into critical infrastructure.


Leading suppliers — Samsung Electronics, SK Hynix, and Micron Technology — are all seeing demand surge for DRAM and high-bandwidth memory (HBM) used in AI training and inference systems.


TrendForce projects global DRAM revenue to reach US$231 billion in 2026, nearly four times the level of 2023’s trough.



Record Profits and “Sold-Out” Capacity

The financial results speak volumes:

  • SK Hynix reports its 2025 production capacity is already fully booked.

  • Samsung’s chip profits soared 80% year-on-year.

  • Micron’s net profit tripled in the latest quarter.

As SK Hynix puts it, the market has entered a “super-boom phase.”


What’s Driving the Surge: HBM and AI Infrastructure


At the center of this growth is high-bandwidth memory (HBM) — a specialized form of DRAM stacked vertically and paired with GPUs to handle massive parallel data flows.
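To make those “massive parallel data flows” concrete, the bandwidth of a single HBM stack follows directly from its very wide interface. A back-of-the-envelope sketch using the published JEDEC HBM3 figures (a 1024-bit bus at 6.4 Gb/s per pin — context added here, not figures from the article):

```python
# Rough per-stack bandwidth estimate for HBM3 (published JEDEC figures,
# supplied here for illustration): 1024-bit interface, 6.4 Gb/s per pin.
bus_width_bits = 1024
pin_rate_gbps = 6.4  # gigabits per second, per pin

# Total bandwidth = bus width x per-pin rate, converted from bits to bytes.
bandwidth_gb_per_s = bus_width_bits * pin_rate_gbps / 8

print(f"{bandwidth_gb_per_s:.1f} GB/s per stack")  # prints "819.2 GB/s per stack"
```

A GPU package carrying several such stacks reaches multiple terabytes per second of memory bandwidth, which is why HBM sits next to the GPU at the heart of AI servers.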


HBM + GPU = the computing heart of AI servers.

This combination allows data centers to train and deploy models like ChatGPT at unprecedented speed.


In October, OpenAI signed memoranda with Samsung and SK Hynix for its Stargate project, naming them as key partners for advanced memory and data infrastructure.


SK Hynix revealed that OpenAI’s DRAM demand could reach 900,000 wafers per month — more than double today’s global HBM capacity.



Traditional Memory Joins the Shortage

Here’s the twist: it is not only HBM that is tight.
Conventional DRAM used in standard servers is also in short supply.


Hyperscalers like Amazon, Google, and Meta are stockpiling to expand traditional data centers — even as most manufacturers have prioritized HBM production.


As Peter Lee, semiconductor analyst at Citigroup, explains:

“Not every AI workload requires HBM. For inference and data retrieval, traditional DRAM can be more cost-efficient.”

That’s why both AI servers and general-purpose servers are now competing for limited memory inventory.



A “New Paradigm” for Memory Procurement

SK Hynix CFO Woo Hyun Kim notes that AI innovation has triggered a fundamental shift in demand:

“The memory market has entered a new paradigm — demand is now expanding across all product lines.”

The company expects the HBM market to grow more than 30% annually over the next five years, even under conservative forecasts.
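To put that forecast in perspective, 30% annual growth compounds quickly. A quick check of what five years at that rate implies for total market size (illustrative arithmetic only, not a figure from the article):

```python
# Compound growth: what "more than 30% annually over five years"
# implies for the HBM market relative to today (illustrative only).
annual_growth = 0.30
years = 5

multiple = (1 + annual_growth) ** years
print(f"~{multiple:.1f}x today's market size")  # prints "~3.7x today's market size"
```

Even this conservative scenario implies the HBM market would roughly triple or quadruple within five years.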


For sourcing teams, this means:
1. Longer lead times for DRAM and HBM.
2. Rising contract prices through 2026.
3. A stronger focus on supplier diversification and long-term partnerships.



The Big Question: Can Supply Keep Up?

Analysts warn that even with record investments, supply might lag far behind demand.


TrendForce expects tight conditions to persist through 2026, possibly into early 2027.


Sanjeev Rana from CLSA cautions:

“The figures are staggering. Current and planned capacity can’t meet this level of demand. The uncertainty is whether OpenAI can actually execute its plans.”



What This Means for Procurement Professionals

AI has redefined memory chips from cyclical commodities to strategic assets.


If hyperscaler infrastructure builds continue as planned, we may be witnessing memory’s longest and most profitable upcycle in decades — but also one of its tightest supply environments.


Procurement teams who anticipate this shift early — by securing long-term supply agreements, monitoring HBM yield trends, and understanding AI server build plans — will have a clear competitive edge.