Memory Chips: A New Competition
Release time: 2026-02-28
Source: Compiled and translated from businesskorea
Low-Power Double Data Rate (LPDDR) memory is emerging as a rising star in the artificial intelligence (AI) memory market. It is attracting significant attention as an alternative solution expected to address AI bottlenecks that High-Bandwidth Memory (HBM) alone cannot fully resolve, while delivering superior energy efficiency. Samsung Electronics and SK Hynix have already unveiled their next-generation LPDDR designs, joining the competition for market share.
According to industry insiders on February 24, Samsung Electronics and SK Hynix unveiled their LPDDR6 development results at the 2026 IEEE International Solid-State Circuits Conference (ISSCC) held in San Francisco, USA, from February 15 to 19 (local time). ISSCC is widely regarded as the most prestigious conference in the field of semiconductor design.
Both companies demonstrated LPDDR6 memory compliant with JEDEC (Joint Electron Device Engineering Council) standards, featuring a maximum transfer speed of 14.4 Gb/s. This represents approximately a 35% speed increase compared to the previous-generation LPDDR5X, which tops out at 10.7 Gb/s. LPDDR6 is expected to launch in the second half of this year.
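For readers who want to check the quoted figure, the increase follows directly from the two per-pin data rates; this is simply the arithmetic behind the article's numbers:

$$\frac{14.4\ \text{Gb/s} - 10.7\ \text{Gb/s}}{10.7\ \text{Gb/s}} \approx 0.346 \approx 35\%$$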
Samsung Electronics focused heavily on enhancing energy efficiency. Most circuits in its design operate reliably even at ultra-low voltages, reducing read power consumption by 27% compared to LPDDR5. At a minimum voltage of 0.97V, it achieves a transfer speed of 12.8 Gb/s, and structural optimizations have improved stability in real-world operating environments.
SK Hynix implemented a High-Efficiency Mode to reduce power consumption while maintaining a transfer speed of 12.8 Gb/s. The company stated that the product delivers 10.9 Gb/s in low-voltage environments and maintains stable performance even at the maximum speed of 14.4 Gb/s. Additionally, SK Hynix showcased power control technology that only supplies the required amount of power.
Traditionally, LPDDR has been primarily used in mobile devices such as smartphones, where high energy efficiency is critical. However, with the expansion of AI infrastructure, its application scope has grown rapidly in recent years. LPDDR is gaining attention because it compensates for the capacity limitations of individual HBM modules while offering excellent energy efficiency. Using LPDDR allows system-level memory capacity expansion, improves data center energy efficiency, and lowers the Total Cost of Ownership (TCO).
Specifically, AI inference generates temporary data known as Key-Value (KV) cache to maintain conversational context. As context length increases, KV cache capacity grows rapidly, potentially exceeding the HBM memory capacity installed in GPUs (Graphics Processing Units) and creating bottlenecks. For this reason, designs combining the high bandwidth of HBM with the high capacity and low power consumption of LPDDR are currently being explored.
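To make that scaling concrete, below is a minimal back-of-the-envelope sketch, not taken from the article, that estimates KV cache size for a hypothetical transformer. The model dimensions, batch size, and the per-GPU HBM capacity are illustrative assumptions rather than figures for any specific product.

```python
# Rough KV cache sizing sketch (illustrative assumptions, not vendor data).
# KV cache bytes ~= 2 (K and V) * layers * kv_heads * head_dim
#                   * context_length * batch_size * bytes_per_value

def kv_cache_gib(context_length: int,
                 batch_size: int = 8,
                 layers: int = 80,           # hypothetical model depth
                 kv_heads: int = 8,          # hypothetical grouped-query KV heads
                 head_dim: int = 128,
                 bytes_per_value: int = 2    # FP16/BF16
                 ) -> float:
    """Estimate KV cache size in GiB for one model instance."""
    total_bytes = (2 * layers * kv_heads * head_dim
                   * context_length * batch_size * bytes_per_value)
    return total_bytes / (1024 ** 3)

if __name__ == "__main__":
    hbm_capacity_gib = 141  # assumed per-GPU HBM capacity, for illustration only
    for ctx in (8_000, 32_000, 128_000, 512_000):
        size = kv_cache_gib(ctx)
        note = "exceeds HBM" if size > hbm_capacity_gib else "fits in HBM"
        print(f"context {ctx:>7,} tokens -> KV cache ~{size:7.1f} GiB ({note})")
```

Under these assumed numbers, the cache fits in HBM at short contexts but grows past a single GPU's capacity once contexts reach hundreds of thousands of tokens, which is the gap that combined HBM-plus-LPDDR designs aim to cover.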
Furthermore, compared to standard DRAM, LPDDR offers greater power and space efficiency, enabling memory capacity expansion without major modifications to existing system architectures. Samsung and SK Hynix have also integrated a metadata region into LPDDR6 to improve data management efficiency and boost AI computing performance.
For example, NVIDIA’s GH200 system connects LPDDR5X memory in the Grace CPU (Central Processing Unit) with HBM3E memory in the Hopper GPU. NVIDIA’s upcoming Vera CPU, scheduled for release in the second half of this year, is expected to feature 1.5 TB of LPDDR5X memory.
According to Micron Technology’s research, expanding LPDDR5X memory capacity from 512 GB to 1.5 TB can reduce Time to First Token (TTFT) in real-time inference environments by up to 98%. Increased memory capacity reduces the need for recomputation, thereby improving initial response speed.
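The mechanism behind that kind of improvement can be sketched roughly: when a conversation's KV cache is evicted for lack of memory, the full prompt history has to be recomputed (prefilled) before the first new token appears; when enough capacity exists to keep the cache resident, only the new turn needs prefill. The sketch below illustrates this with assumed throughput and token counts that are not taken from Micron's study.

```python
# Illustrative TTFT comparison: full prompt recomputation vs. reusing a
# resident KV cache. Throughput and token counts are assumptions for this
# sketch only; they are not Micron's measurements.

PREFILL_TOKENS_PER_S = 20_000   # assumed prefill throughput
DECODE_LATENCY_S = 0.02         # assumed time to emit the first token

def ttft_seconds(history_tokens: int, new_turn_tokens: int,
                 cache_resident: bool) -> float:
    """Time to first token: prefill whatever is not already cached, then decode."""
    tokens_to_prefill = (new_turn_tokens if cache_resident
                         else history_tokens + new_turn_tokens)
    return tokens_to_prefill / PREFILL_TOKENS_PER_S + DECODE_LATENCY_S

history, new_turn = 150_000, 1_000   # long conversation, short new question
cold = ttft_seconds(history, new_turn, cache_resident=False)
warm = ttft_seconds(history, new_turn, cache_resident=True)
print(f"KV cache evicted  : TTFT ~{cold:.2f} s")
print(f"KV cache resident : TTFT ~{warm:.2f} s  ({(1 - warm / cold):.0%} lower)")
```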
One industry insider stated: “HBM alone cannot meet all the rapidly growing memory demands.” He added: “As demand for server DRAM and LPDDR5X grows in tandem with HBM, this will improve overall profitability for related memory products.”