SK hynix has introduced the world’s first 16-Hi HBM3E memory stacks, reaching a record 48 GB of capacity from a stack of sixteen DRAM dies (3 GB each). At the SK AI Summit 2024 in Seoul, CEO Kwak Noh-Jung announced SK hynix’s strategy to become a full-stack provider of DRAM and NAND memory solutions for AI.
Kwak Noh-Jung noted that the role of memory has changed significantly in recent decades: from storing data on personal computers and smartphones to supporting cloud services and social networks. As AI develops further, memory will play an even more important role, enabling new forms of interaction and creativity for users. SK hynix’s Creative Memory concept relies on next-generation semiconductors to deliver the computing power needed for complex AI workloads.
SK hynix is actively pursuing innovation, creating solutions it says have no direct analogues on the market. Its flagship offerings fall into the Beyond Best category: highly competitive products whose innovations are focused on the needs of AI systems. The company plans to provide samples of the 48 GB HBM3E early next year, underscoring its commitment to pioneering AI memory.
According to SK hynix, 16-Hi HBM3E delivers up to an 18% performance improvement in AI model training and up to 32% in data processing compared with 12-Hi solutions. As demand for AI accelerators grows, this product should help SK hynix strengthen its position in the AI memory market. Mass production of 16-layer HBM3E will rely on Advanced MR-MUF, the packaging technology already used successfully for the 12-Hi parts.
In addition to HBM3E, SK hynix is developing solutions for other sectors, including LPCAMM2 modules for PCs and data centers, as well as energy-efficient LPDDR5 and LPDDR6 memory based on the 1c process technology.
The company also plans to integrate logic directly into the base die of HBM4 memory, made possible through a partnership with one of the leading logic semiconductor manufacturers. This will allow SK hynix to develop custom HBM solutions tailored to specific customer requirements for capacity, bandwidth and functionality.
In response to the growing memory needs of AI systems, SK hynix is developing solutions based on CXL (Compute Express Link) that will allow different types of memory to be combined into high-capacity pools. In parallel, the company is working on ultra-high-capacity eSSDs that can store large amounts of data in a limited footprint with optimized power consumption.
In an effort to overcome the so-called “memory wall”, SK hynix is developing memory with embedded computing capabilities. Solutions such as Processing Near Memory (PNM), Processing In Memory (PIM) and Computational Storage process data close to where it is stored, minimizing data movement and increasing throughput. These innovations open up new prospects for next-generation AI systems, allowing resource-intensive tasks to be completed with minimal latency.