Nvidia supplier SK Hynix says its high-bandwidth memory chips used in AI processors are almost sold out for 2025
- SK Hynix has already sold out its high-bandwidth memory chips for this year, as enterprises aggressively expand AI services
- The South Korean memory chip maker forecast annual demand growth for HBM chips of about 60 per cent over the mid to long term
Micron has also said its HBM chips were sold out for 2024 and that most of its 2025 supply was already allocated. It plans to provide samples for its 12-layer HBM3E chips to customers in March.
“As AI functions and performance are being upgraded faster than expected, customer demand for ultra-high-performance chips such as the 12-layer chips appears to be increasing faster than for 8-layer HBM3Es,” said Jeff Kim, head of research at KB Securities.
Samsung, which plans to produce its HBM3E 12-layer chips in the second quarter, said this week that this year’s shipments of HBM chips are expected to increase more than three-fold and that it has completed supply discussions with customers. The company did not elaborate further.
Kwak of SK Hynix said investment in HBM differed from past patterns in the memory chip industry because production capacity is expanded only after demand has been confirmed.
By 2028, chips made for AI, such as HBM and high-capacity DRAM modules, are expected to account for 61 per cent of the total memory market by value, up from about 5 per cent last year, SK Hynix’s head of AI infrastructure Justin Kim said.