
- Samsung begins commercial HBM4 shipments as AI memory competition heats up
- HBM4 reaches 11.7Gbps speeds while pushing higher bandwidth and efficiency gains for data centers
- Samsung scales production plans with roadmap extending to HBM4E and custom memory variants
Samsung says it has not only begun mass production of HBM4 memory, but also shipped the first commercial units to customers, claiming an industry first for the new high bandwidth memory generation.
HBM4 is built on Samsung’s sixth-generation 10nm-class DRAM process and uses a 4nm logic base die, which reportedly helped the South Korean memory giant reach stable yields without redesigns as production ramped up.
That’s a technical claim which will likely be tested once large scale deployments start and independent performance results appear.
Up to 48GB capacity
The new memory reaches a consistent transfer speed of 11.7Gbps, with headroom up to 13Gbps in certain configurations.
Samsung compares that with an industry baseline of 8Gbps and puts HBM3E at 9.6Gbps. Total memory bandwidth climbs to 3.3TB/s per stack, roughly 2.7 times that of the previous generation.
Capacity ranges from 24GB to 36GB in 12-layer stacks, with 16-layer versions coming later that could push capacity to 48GB for customers needing denser configurations.
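The article only quotes finished stack capacities, but the numbers line up with the usual capacity-equals-die-density-times-layer-count arithmetic. A minimal sketch follows; the 16Gb and 24Gb per-die densities are illustrative assumptions, not figures Samsung gives here.

```python
# Rough capacity check: stack capacity (GB) = per-die density (Gb) / 8 * layer count.
# The 16Gb and 24Gb die densities are illustrative assumptions; the article
# only quotes the finished stack capacities (24GB, 36GB, and a future 48GB).

def stack_capacity_gb(die_density_gbit: int, layers: int) -> float:
    """Capacity of one HBM stack in GB for a given die density and layer count."""
    return die_density_gbit / 8 * layers

print(stack_capacity_gb(16, 12))  # 24.0 -> low end of the 12-layer range
print(stack_capacity_gb(24, 12))  # 36.0 -> top end of the 12-layer range
print(stack_capacity_gb(24, 16))  # 48.0 -> the future 16-layer option
```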
Power use is a key issue as HBM designs increase pin counts, and this generation moves from 1,024 to 2,048 pins.
Samsung says it improved power efficiency by about 40% compared with HBM3E using low-voltage through-silicon via (TSV) technology and power distribution tweaks, alongside thermal changes that improve heat dissipation and heat resistance.
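The per-pin data rate and the 2,048-pin interface are enough to reproduce the bandwidth figure quoted earlier. Below is a minimal sketch of the standard width-times-rate arithmetic (the formula is the usual convention, not something Samsung spells out here); it suggests the 3.3TB/s figure corresponds to the 13Gbps ceiling rather than the 11.7Gbps standard rate.

```python
# Per-stack bandwidth (TB/s) = interface width (pins) * per-pin rate (Gbps) / 8 bits / 1000.
PINS = 2048  # HBM4 interface width cited above

def stack_bandwidth_tbs(pin_rate_gbps: float, pins: int = PINS) -> float:
    """Aggregate bandwidth of one HBM stack in TB/s for a given per-pin data rate."""
    return pins * pin_rate_gbps / 8 / 1000

print(round(stack_bandwidth_tbs(11.7), 2))  # ~3.0 TB/s at the standard rate
print(round(stack_bandwidth_tbs(13.0), 2))  # ~3.33 TB/s at the 13Gbps ceiling
```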
“Instead of taking the conventional path of utilizing existing proven designs, Samsung took the leap and adopted the most advanced nodes like the 1c DRAM and 4nm logic process for HBM4,” said Sang Joon Hwang, EVP and Head of Memory Development at Samsung Electronics.
“By leveraging our process competitiveness and design optimization, we are able to secure substantial performance headroom, enabling us to satisfy our customers’ escalating demands for higher performance, when they need them.”
The company also points to its manufacturing scale and in-house packaging as key reasons it can meet expected demand growth.
That includes closer coordination between foundry and memory teams as well as partnerships with GPU makers and hyperscalers building custom AI hardware.
Samsung says it expects its HBM business to grow sharply across 2026, with HBM4E sampling planned for later in the year and custom HBM samples set to follow in 2027.
Whether competitors respond with similar timelines or faster alternatives will shape how long this early lead lasts.




