NVIDIA is qualifying Samsung’s new HBM3E chips, will use them for future B200 AI GPUs

NVIDIA CEO Jensen Huang told the press during a media briefing at GTC 2024 that “HBM memory is very complicated and the value added is very high. We are spending a lot of money on HBM.” Jensen added: “Samsung is very good, a very good company.”



SK hynix currently fills most of the advanced HBM3 and HBM3E memory needs for NVIDIA and its growing arsenal of AI GPUs, with the Hopper H100, H200, and new Blackwell B100 and B200 AI GPUs all using HBM memory. Jensen continued: “The upgrade cycle for Samsung and SK Hynix is incredible. As soon as NVIDIA starts growing, they grow with us. I value our partnership with SK Hynix and Samsung very incredibly.”

The news directly from NVIDIA's CEO that the company will use HBM memory supplied by Samsung saw the South Korean company's shares jump 5.6% on Wednesday.

Samsung recently showed off its new 32Gbps GDDR7 memory at GTC 2024, with 28Gbps GDDR7 memory expected to debut inside NVIDIA's next-gen Blackwell-based GeForce RTX 50 series GPUs. Samsung also worked with NVIDIA last summer on technical verification for its fourth-generation HBM3 memory chips and packaging services.

South Korean HBM rival SK hynix announced mass production of its next-generation HBM3E memory chips this week, providing NVIDIA with HBM3E for its new Blackwell B200 AI GPUs. SK hynix sampled its HBM3E memory chips to NVIDIA last year, so it remains months ahead of Samsung at this point.