
SK Hynix pumps billions into HBM chips to meet AI demand

High-bandwidth memory (HBM) is becoming a key technology in the ongoing race to invest in AI, as SK hynix plans to spend billions on memory chip production and China’s Huawei seeks to develop its own in partnership with a local foundry.

SK hynix, the world’s second-largest memory chip maker, is set to invest 103 trillion won ($74.5 billion) to strengthen its semiconductor division by 2028, the company said after a management strategy meeting at the end of June.

According to its investment plan, 80% of that total (82 trillion won, or about $60 billion) is to be spent on AI-related business areas such as HBM, BusinessKorea reported, increasing production capacity to meet growing demand.

As The Register reported some time ago, SK hynix has already sold out all of the HBM chips it will make this year, as well as most of its planned production for 2025, thanks to demand driven by the AI craze. That demand is partly because its HBM chips are optimized for use with Nvidia's high-end GPU accelerators, and because the company was one of the first players in the HBM market.

HBM was developed as a way to increase memory bandwidth for key applications by placing chips in the same package as CPU or GPU chips, sometimes directly stacked on top of them so that connections are much shorter. Our colleagues at Blocks and Files have an explainer on HBM.

There have been warnings that the industry’s enthusiasm for HBM could potentially cause a shortage of DRAM supply unless more manufacturing lines can be brought on stream, as demand for the memory is expected to grow 200% this year and double again in 2025.

Memory giant Samsung also hopes to benefit from the AI memory boom, but is still reportedly waiting for its HBM chips to be certified for use with Nvidia's GPU accelerators. The company was recently forced to deny rumors that its chips had failed to meet Nvidia's power consumption and heat requirements.

Micron, the third-largest memory maker, also said its HBM production capacity was sold out through 2025 in its recent financial report for the third quarter of fiscal 2024. The Boise, Idaho-based company said it is "well positioned to generate substantial revenue" in its 2025 fiscal year, thanks to AI driving demand for memory chips.

However, Micron also revealed that the new fabs it is building in the US will not come online to contribute to its memory supply until FY27 in the case of the Boise site, while the New York fab is not expected to begin shipping until FY28 or later.

Meanwhile, it has been reported that Chinese tech giant Huawei is seeking to secure its own HBM chips in partnership with Wuhan Xinxin Semiconductor Manufacturing, in order to circumvent US sanctions that have cut off its access to many components manufactured outside China.

According to the South China Morning Post, the initiative involves several other Chinese companies, such as packaging companies Jiangsu Changjiang Electronics and Tongfu Microelectronics, which would provide an equivalent to the Chip on Wafer on Substrate (CoWoS) technique used to combine Nvidia’s GPUs with HBM chips.

It was previously reported that a company called ChangXin Memory Technologies (CXMT) is set to become China’s first producer of HBM modules. ®