New chips mark a strategic push to close the gap with SK Hynix in the fast-growing market for AI data center components.
Samsung Electronics announced on Thursday that it has started shipping its most advanced HBM4 (High Bandwidth Memory) chips to customers, although it did not disclose their identities. The move comes amid a global surge in demand for memory components that power AI data centers, where high-speed data transfer is essential for training and running large-scale artificial intelligence models.
HBM chips are critical for feeding massive volumes of data to AI accelerators—specialized processors used by companies like Nvidia to handle intensive AI workloads. As the race to build AI infrastructure intensifies, the market for high-performance memory has become increasingly competitive, with Samsung seeking to close the gap with its main rival, SK Hynix, which has held a dominant position in the space.
Samsung has faced criticism for being slow to respond to the rapidly evolving HBM market. The company lagged behind competitors in supplying previous-generation HBM chips, and SK Hynix has maintained a leading position in both production capacity and market share.
However, Samsung’s latest announcement signals a strategic push to challenge that dominance. At a presentation on Wednesday, Song Jai-hyuk, chief technology officer for Samsung’s chip division, highlighted customer feedback on the performance of the new HBM4 chips, describing the feedback as “very satisfactory.”
What Makes HBM4 Important?
Samsung said its HBM4 chips deliver a consistent data transfer speed of 11.7 gigabits per second (Gbps), a 22% improvement over its previous HBM3E generation. The company also noted that the chips can reach a maximum speed of 13 Gbps, a capability designed to ease growing data bottlenecks in AI systems.
The speed gains are significant because modern AI models require massive amounts of data to be moved quickly between memory and processors. HBM4’s faster throughput can directly impact the efficiency and performance of AI training and inference.
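As a rough sanity check on those figures, the sketch below works through the arithmetic. It assumes the quoted speeds are per-pin rates, a 9.6 Gbps HBM3E baseline, and a 2048-bit interface per HBM4 stack; only the 11.7 Gbps, 13 Gbps, and 22% numbers come from Samsung's announcement, the rest are assumptions for illustration.

```python
# Back-of-the-envelope check of the HBM4 figures quoted above.
# Assumptions (not from Samsung's statement): 9.6 Gbps/pin HBM3E baseline,
# 2048-bit interface width per HBM4 stack, and that the quoted speeds are per pin.

HBM3E_PIN_GBPS = 9.6          # assumed previous-generation pin speed
HBM4_PIN_GBPS = 11.7          # Samsung's stated sustained pin speed
HBM4_MAX_PIN_GBPS = 13.0      # Samsung's stated maximum pin speed
HBM4_BUS_WIDTH_BITS = 2048    # assumed interface width per HBM4 stack

# Pin-speed improvement: 11.7 / 9.6 - 1 is roughly 22%, matching the claim.
improvement = HBM4_PIN_GBPS / HBM3E_PIN_GBPS - 1
print(f"Pin-speed gain over HBM3E baseline: {improvement:.0%}")

def stack_bandwidth_gbs(pin_gbps: float, width_bits: int = HBM4_BUS_WIDTH_BITS) -> float:
    """Peak bandwidth of one HBM stack in GB/s: pin speed (Gbit/s) * bus width / 8."""
    return pin_gbps * width_bits / 8

print(f"HBM4 per-stack peak at 11.7 Gbps/pin: {stack_bandwidth_gbs(HBM4_PIN_GBPS):,.0f} GB/s")
print(f"HBM4 per-stack peak at 13 Gbps/pin:   {stack_bandwidth_gbs(HBM4_MAX_PIN_GBPS):,.0f} GB/s")
```

Under those assumptions, a single HBM4 stack would move roughly 3 TB/s, which is why per-pin speed gains translate directly into faster feeding of AI accelerators.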
Samsung also confirmed plans to release samples of next-generation HBM4E chips in the second half of this year, signaling a continued effort to advance its technology roadmap.
Market Reaction and Competitive Landscape
The announcement was well received by investors. Samsung shares closed up 6.4% on Thursday, while SK Hynix shares rose 3.3%.
The competition in the HBM sector has been intensifying:
- SK Hynix has stated it aims to maintain its “overwhelming” market share in next-generation HBM4 chips, which are already in volume production. The company also said it is working to achieve production yields for HBM4 comparable to its current HBM3E chips.
- Micron, another major memory chipmaker, has also entered high-volume production of HBM4 and begun customer shipments, according to media reports.
What This Means for AI Infrastructure
As AI workloads continue to expand, the demand for faster, more efficient memory solutions will only grow. Samsung’s entry into HBM4 shipments could intensify competition and help lower costs or accelerate innovation in AI hardware.
The battle for dominance in HBM technology is shaping up to be a critical element of the broader AI supply chain, as data centers worldwide scale up to support the next generation of artificial intelligence systems.
