SK hynix announces world’s first 48GB 16-Hi HBM3E memory – Next generation PCIe 6.0 SSDs and UFS 5.0 storage also in the works

CEO of SK hynix.

At the SK AI Summit 2024, the SK hynix CEO took the stage and unveiled the industry’s first 16-Hi HBM3E memory, beating both Samsung and Micron to the punch. With HBM4 development going strong, SK hynix has prepared a 16-layer version of its HBM3E offering to ensure “technological stability,” and aims to offer samples as early as next year.

A few weeks ago, SK hynix unveiled a 12-Hi variant of its HBM3E memory, securing contracts from AMD (MI325X) and Nvidia (Blackwell Ultra). After raking in record profits last quarter, SK hynix is back in full force: the giant just announced a 16-layer upgrade to its HBM3E series, with a capacity of 48 GB (3 GB per individual die) per stack. With this increase in density, AI accelerators can now accommodate up to 384 GB of HBM3E memory in an 8-stack configuration.
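The capacity figures above follow directly from the stack geometry. A minimal sketch of the arithmetic, using only the numbers quoted in this article (16 dies per stack, 3 GB per die, 8 stacks per accelerator):

```python
# Back-of-the-envelope HBM3E capacity math from the article's figures.
dies_per_stack = 16   # 16-Hi vertical stacking
gb_per_die = 3        # 3 GB per individual DRAM die
stacks_per_gpu = 8    # 8-stack accelerator configuration

stack_capacity_gb = dies_per_stack * gb_per_die     # 48 GB per stack
gpu_capacity_gb = stacks_per_gpu * stack_capacity_gb  # 384 GB per accelerator

print(stack_capacity_gb, gpu_capacity_gb)  # 48 384
```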

SK hynix claims an 18% improvement in training performance, in addition to a 32% improvement in inference performance. Like its 12-Hi counterpart, the new 16-Hi HBM3E memory uses packaging technologies such as MR-MUF, which connects chips together by melting the solder between them. SK hynix expects 16-Hi HBM3E samples to be ready in early 2025. However, this memory may be short-lived, because Nvidia’s next-generation Rubin chips are planned for mass production later next year and will be based on HBM4.

SK hynix HBM3E 16-Hi

That’s not all, as the company is actively working on PCIe 6.0 SSDs, high-capacity QLC (Quad-Level Cell) eSSDs aimed at AI servers, and UFS 5.0 for mobile devices. In addition, to power future laptops and even handhelds, SK hynix is developing an LPCAMM2 module and soldered LPDDR5/6 memory using the 1cnm node. There’s no mention of CAMM2 modules for desktops, so PC folks will have to wait, at least until CAMM2 adoption matures.

To overcome what SK hynix calls the ‘memory wall’, the memory maker is developing solutions such as Processing Near Memory (PNM), Processing In Memory (PIM), and computational storage. Samsung has already demonstrated its version of PIM, which processes data inside the memory itself so that it does not have to be moved to an external processor.

HBM4 doubles the channel width from 1024 bits to 2048 bits and supports more than 16 vertically stacked DRAM chips (16-Hi) – each with up to 4 GB of memory. Those are some monumental upgrades, generation over generation, and should be enough to meet the high memory requirements of upcoming AI GPUs.
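To put those generational upgrades in perspective, here is a hedged sketch of the per-stack HBM4 figures implied by the article (2048-bit interface, 16-Hi stacks at up to 4 GB per die; real products may differ):

```python
# HBM4 per-stack figures implied by the article's numbers.
hbm3e_width_bits = 1024  # HBM3E interface width
hbm4_width_bits = 2048   # HBM4 doubles the channel width
dies_per_stack = 16      # 16-Hi (HBM4 supports more than 16 layers)
gb_per_die = 4           # up to 4 GB per DRAM die

width_gain = hbm4_width_bits // hbm3e_width_bits  # 2x wider interface
stack_capacity_gb = dies_per_stack * gb_per_die   # 64 GB per 16-Hi stack

print(width_gain, stack_capacity_gb)  # 2 64
```

At the same die count, that is a 16 GB jump over the 48 GB 16-Hi HBM3E stack announced here.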

Samsung’s HBM4 tape-out will reportedly take place later this year. On the other hand, reports suggest that SK hynix already reached the tape-out phase in October. Following a traditional silicon development life cycle, Nvidia and AMD may receive qualification samples in the first or second quarter of next year.

The SK AI Summit 2024 is being held at the COEX Convention Center in Seoul from November 4 to 5. The company claims the event is the largest AI symposium in Korea.