SK hynix pumps billions into HBM chips to meet AI demand

High bandwidth memory (HBM) is becoming a key technology in the continued AI investment race as SK hynix plans to spend billions on memory chip production and China’s Huawei looks to develop its own in partnership with a local foundry.

SK hynix, the world’s second biggest memory chipmaker, is set to invest ₩103 trillion ($74.5 billion) in boosting its semiconductor division between now and 2028, the company announced after a management strategy meeting at the end of June.

According to the investment plan, 80 percent of that total (₩82 trillion, or about $60 billion) will be directed toward AI-related business areas such as HBM to increase production capacity in response to growing demand, BusinessKorea reported.

As The Register reported a while back, SK hynix has already sold all the HBM chips it will manufacture this year, as well as most of its expected 2025 production, owing to demand driven by the AI craze. That demand is partly because its HBM chips are optimized for use with Nvidia's top-end GPU accelerators, and because the company was an early mover in the HBM market.

HBM was developed as a way of boosting memory bandwidth for key applications by putting the memory chips inside the same package as the CPU or GPU, sometimes stacked directly on top of them so that the connections are much shorter. Our colleagues over at Blocks & Files have an explainer on HBM.

There have been warnings that industry enthusiasm for HBM could cause a DRAM supply shortage unless more manufacturing lines are brought into play, as demand for this memory is expected to grow 200 percent this year and double again in 2025.
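To make that compounding concrete: 200 percent growth means tripling. A back-of-the-envelope sketch (the baseline of 1.0 is arbitrary; only the percentage figures come from the forecast above):

```python
# Rough HBM demand projection from the cited growth figures.
# Baseline of 1.0 is an arbitrary unit; only the multipliers are sourced.
baseline = 1.0
end_2024 = baseline * (1 + 2.00)   # "grow 200 percent this year" -> 3x baseline
end_2025 = end_2024 * 2            # "double again in 2025" -> 6x baseline
print(end_2024, end_2025)          # 3.0 6.0
```

In other words, if the forecast holds, HBM demand at the end of 2025 would be roughly six times its level at the start of 2024.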

Memory giant Samsung is also expecting to cash in on the AI memory boom, but is still said to be waiting for its HBM chips to be certified for use with Nvidia's GPU accelerators. The company was recently forced to deny rumors that its chips were failing to meet Nvidia's power consumption and heat requirements.

Micron, the third-largest memory maker, said during its recent Q3 FY24 earnings report that its HBM production capacity has also been sold out through 2025. The Boise, Idaho outfit said that it was "well positioned to deliver a substantial revenue record" in its fiscal 2025, thanks to AI driving demand for memory chips.

However, Micron also disclosed that the new fabrication plants it is building in the US will not come on-stream to contribute to its memory supply until FY27 in the case of Boise, while the New York facility is not expected to start delivering until FY28 or later.

Meanwhile, it has been reported that Chinese tech giant Huawei is looking to secure its own HBM chips in partnership with Wuhan Xinxin Semiconductor Manufacturing as a way of circumventing US sanctions that have closed off its access to many components made outside China.

According to South China Morning Post, this initiative involves several other Chinese companies, such as packaging firms Jiangsu Changjiang Electronics and Tongfu Microelectronics, which are said to be providing an equivalent to the Chip on Wafer on Substrate (CoWoS) technique used to combine Nvidia’s GPUs with HBM chips.

It was previously reported that a company called ChangXin Memory Technologies (CXMT) was expected to become China’s first homegrown producer of HBM modules. ®
