SK hynix has introduced the world’s first HBM3E memory stacks with a capacity of 48 GB, built from 16 stacked DRAM dies – a new record for the 16-Hi architecture. At the SK AI Summit 2024 in Seoul, CEO Kwak Noh-Jung announced SK hynix’s strategy to become a full-line provider of DRAM and NAND memory solutions for AI.
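As a quick sanity check, the per-die density implied by the announced figures can be worked out as follows (the 3 GB / 24 Gbit per-die value is an inference from the totals, not a number stated by SK hynix):

```python
# Implied per-die density of the announced 16-Hi HBM3E stack.
dies_per_stack = 16
stack_capacity_gb = 48                             # GB per stack, as announced

per_die_gb = stack_capacity_gb / dies_per_stack    # implied GB per DRAM die
per_die_gbit = per_die_gb * 8                      # implied density in Gbit

print(f"{per_die_gb:g} GB ({per_die_gbit:g} Gbit) per die")  # 3 GB (24 Gbit) per die
```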

Kwak Noh-Jung noted that the role of memory has changed significantly in recent decades: from storing data on personal computers (PCs) and smartphones to powering cloud services and social networks. In the future, as AI develops, memory will play an even more important role, enabling new forms of interaction and creativity for users. SK hynix’s Creative Memory concept leverages next-generation semiconductors to deliver the computing capabilities needed for complex tasks.

SK hynix is actively pursuing innovation, creating solutions it says have no equivalents on the market. These fall under its Beyond Best category – highly competitive products whose innovations are tailored to the needs of AI systems. The company plans to provide samples of the 48 GB HBM3E memory early next year, underscoring its commitment to pioneering AI memory advances.

According to SK hynix, 16-Hi HBM3E delivers up to 18% higher performance in AI model training and up to 32% higher performance in inference compared to 12-Hi solutions. With growing demand for AI accelerators, this product should help SK hynix strengthen its position in the AI memory market. Mass production of 16-layer HBM3E will use the company’s Advanced MR-MUF packaging technology, already proven on 12-Hi products.

In addition to HBM3E, SK hynix is developing solutions for other sectors, including LPCAMM2 modules for PCs and data centers, as well as energy-efficient LPDDR5 and LPDDR6 memory based on its 1c process technology.

The company also plans to integrate logic into the base die of HBM4 memory, made possible through a partnership with one of the leading logic semiconductor manufacturers. This will allow SK hynix to develop customized HBM solutions tailored to specific customer requirements for capacity, bandwidth and functionality.

HBM3E memory delivers data transfer speeds of 9.2 Gbps per pin, translating into a per-stack bandwidth of roughly 1.18 TB/s
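The headline bandwidth follows directly from the pin speed. The 1024-bit interface width assumed below is the standard width of an HBM stack; it is an assumption here, not a figure from the announcement:

```python
# Per-stack HBM3E bandwidth derived from the per-pin data rate.
# The 1024-bit interface width is the standard HBM stack width (assumed).
pin_rate_gbps = 9.2              # Gbit/s per pin
interface_width_bits = 1024      # data pins (bits) per stack, assumed

bandwidth_gb_s = pin_rate_gbps * interface_width_bits / 8   # GB/s
print(f"{bandwidth_gb_s:.1f} GB/s")  # 1177.6 GB/s, i.e. about 1.18 TB/s
```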

In response to the growing memory needs of AI systems, SK hynix is developing solutions based on CXL interconnects that will enable different types of memory to be combined into high-capacity pools. In parallel, the company is developing ultra-high-capacity eSSDs, allowing large amounts of data to be stored efficiently in a limited space and with optimal power consumption.

In an effort to overcome the so-called “memory wall”, SK hynix is developing technologies with embedded computing capabilities. Solutions such as Processing near Memory (PNM), Processing in Memory (PIM) and Computational Storage process data close to where it resides, minimizing data movement and increasing throughput. These innovations should open up new prospects for next-generation AI systems, enabling resource-intensive tasks to run with minimal latency.
