South Korean company SK hynix, the world's second-largest memory chip manufacturer, will begin mass production of 12-layer HBM3E high-speed memory stacks by the end of this month. Reuters reports this, citing a statement by one of the company's senior executives.
Justin Kim, president and head of SK hynix's AI infrastructure division, announced the imminent start of mass production of HBM3E memory while speaking at the Semicon Taiwan industry forum in Taipei.
Back in July, SK hynix revealed plans to supply new-generation HBM memory chips (12-layer HBM3E). The manufacturer said that mass deliveries of these chips are planned for the fourth quarter of this year, and that deliveries of HBM4 memory will begin in the second half of 2025.
High Bandwidth Memory (HBM) is a type of dynamic random access memory (DRAM). The standard was first introduced in 2013. To save space and reduce power consumption, these memory chips consist of multiple dies stacked on top of each other. Today such memory stacks are used mainly in specialized accelerators for artificial intelligence (AI) workloads: high-bandwidth memory makes it possible to process the enormous volumes of data needed to train AI models.
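As a rough illustration of where the "high bandwidth" comes from, the sketch below estimates per-stack throughput from the interface width and per-pin data rate. The 1,024-bit bus width and ~9.6 Gb/s per-pin rate are assumed figures typical of published HBM3E specifications, not values from SK hynix's announcement.

```python
# Rough, illustrative estimate of HBM3E per-stack bandwidth (assumed figures).
# The very wide interface across stacked dies is what distinguishes HBM
# from conventional DRAM modules.

BUS_WIDTH_BITS = 1024   # assumed HBM3E interface width per stack, in bits
PIN_RATE_GBPS = 9.6     # assumed per-pin data rate, in Gb/s

bandwidth_gbps = BUS_WIDTH_BITS * PIN_RATE_GBPS  # total Gb/s per stack
bandwidth_gb_per_s = bandwidth_gbps / 8          # convert bits to bytes

print(f"~{bandwidth_gb_per_s:.0f} GB/s per stack")  # ~1229 GB/s, i.e. about 1.2 TB/s
```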
In May, SK hynix CEO Kwak Noh-Jung said the company's HBM memory production was fully booked for 2024 and nearly sold out for 2025.
In addition to SK hynix, the other key suppliers of HBM memory are Micron and Samsung Electronics. SK hynix is Nvidia's main supplier of HBM chips. At the end of March, the manufacturer delivered samples of HBM3E memory to one of its customers, but SK hynix declined to say which one.