Major technology companies have increased their investments in artificial intelligence in 2024, and operating margins at Samsung and SK hynix are expected to exceed 40% as a result. The two Korean manufacturers have decided to expand their DRAM businesses: thanks to the rapid growth of AI, the segment promises to outgrow the rest of semiconductor production.

Image source: skhynix.com

Samsung will build the PH3 production line for DRAM and other components at its Pyeongtaek plant in Gyeonggi Province and will suspend construction of a cleanroom for contract semiconductor manufacturing, Korea Electronic Daily has learned. The plant will produce DRAM chips for HBM memory destined for AMD AI accelerators. Samsung's capital expenditures on DRAM, excluding building construction costs, will grow by 9.2% by the end of 2024 to $9.5 billion, the highest level since 2020, and in 2025 they are expected to rise further to $12 billion.

SK hynix, which was forced to cut production last year, is now ready to more than triple its spending on DRAM: from $2.3 billion in 2023 to as much as $7.1 billion in 2024. The global memory chip market is growing: the industry, which includes DRAM and NAND, is poised to nearly double this year to $175 billion, while the contract semiconductor manufacturing market dominated by TSMC is expected to stall at $120.3 billion. The memory chip market will top $200 billion in 2025, with DRAM accounting for $162 billion of that total.

Image source: samsung.com

Samsung and SK hynix intend to invest billions of dollars in developing next-generation products, including Processing In Memory (PIM) technology and the CXL (Compute Express Link) interface. At the beginning of the year, SK hynix received a tempting offer from an AI accelerator manufacturer, which asked the Korean company to build a dedicated memory chip production line and promised an advance payment of more than 500 billion won ($372.41 million). SK hynix was forced to decline, however, since it had already committed to supplying products worth 1 trillion won ($744.81 million) to Nvidia, the world leader in the AI accelerator segment.

This year, 13 of the largest technology companies in the United States and China, including Google and Alibaba Group, intend to invest $226.2 billion in AI data centers, up 33.7% from 2023. In 2025, this figure will grow by a further 13.4% to $256.6 billion. A significant portion of this budget will likely be spent on AI accelerators, which means demand for DRAM will remain high. Prices for HBM memory, which accounts for 26% of Samsung's DRAM capacity and 28% of SK hynix's, will be five to six times higher than DDR5 prices by the end of 2025, boosting profitability for both companies and pushing their operating profit margins above 40%.
