Key Takeaways
Nvidia relies on the latest high-bandwidth memory (HBM) chips to build its most powerful graphics processing units (GPUs).
As demand for Nvidia GPUs has increased in recent years, its suppliers have struggled to keep up.
For South Korean memory chip rivals SK Hynix and Samsung, Nvidia contracts can provide significant revenues and promise future growth. As they compete for these lucrative deals, both companies are racing to develop better HBM solutions and scale their manufacturing capacity.
While Nvidia still uses an alternative memory technology known as graphics double data rate (GDDR) for its smaller GPUs, it favors HBM for the heavyweight processors used in AI training.
A crucial difference between the two chip families is that while GDDR memory chips typically sit on the circuit board around the GPU, HBM is stacked vertically and mounted inside the GPU package, right next to the processor die itself. This proximity to the processor gives HBM its primary advantage—speed.
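The speed advantage comes largely from interface width: sitting next to the processor lets an HBM stack connect over a far wider bus than a board-mounted GDDR chip can. A back-of-the-envelope sketch illustrates the gap, using representative figures (a 1024-bit HBM3 stack at roughly 6.4 Gb/s per pin versus a 32-bit GDDR6X chip at roughly 21 Gb/s per pin); the exact rates vary by product generation and are used here only for illustration.

```python
def bandwidth_gb_per_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

# One HBM3 stack: very wide 1024-bit interface at a modest per-pin rate.
hbm3_stack = bandwidth_gb_per_s(1024, 6.4)

# One GDDR6X chip: narrow 32-bit interface at a much higher per-pin rate.
gddr6x_chip = bandwidth_gb_per_s(32, 21.0)

print(f"HBM3 stack:  {hbm3_stack:.0f} GB/s")   # -> 819 GB/s
print(f"GDDR6X chip: {gddr6x_chip:.0f} GB/s")  # -> 84 GB/s
```

Even though each GDDR pin runs faster, the sheer width of the HBM interface yields roughly ten times the bandwidth per device—which is why AI accelerators pair several HBM stacks with a single GPU.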
As Nvidia has accelerated GPU development, its memory needs have grown, and the firm has called on suppliers to raise their game.
For instance, CEO Jensen Huang reportedly asked SK Hynix to accelerate the production of more advanced HBM4 chips by six months.
Originally slated for mass production in 2026, the chips are now reportedly on an expedited roadmap, with SK Hynix planning to ship the first units in the second half of 2025—just in time for the anticipated launch of Nvidia’s next-generation Rubin R100 GPU.
Meanwhile, Samsung has also brought forward its HBM4 timeline, with a similar target to hit mass production in late 2025.
When it started incorporating HBM into its GPU designs, Nvidia originally favored SK Hynix chips. Media reports suggest Samsung’s chips didn’t meet Nvidia’s standards for use in its AI processors due to problems with heat and power consumption.
However, in July this year, Nvidia reportedly approved Samsung’s HBM3 for use in its H20 processor, a slimmed-down version of its flagship H200 made for the Chinese market.
Working with Samsung could help Nvidia ramp up GPU production by addressing supply chain bottlenecks. Meanwhile, Samsung investors see an expanded Nvidia partnership as a potential cash cow.
During its recent quarterly earnings call, Samsung reported strong demand for HBM, driven by increased investments in AI infrastructure.
“The Company plans to actively respond to the demand for high-value-added products for AI and will expand capacity to increase the portion of HBM3E sales,” it said in a statement.