AMD announced its new Instinct MI series AI accelerators at Computex 2024 earlier this week, and now we're hearing the company is cozying up to Samsung to secure the HBM memory those upcoming chips will need.
During Computex 2024, AMD CEO Dr. Lisa Su met with the Seoul Economic Daily to talk about the company's HBM collaboration with South Korean giant Samsung. Samsung has now been confirmed as a supplier of fifth-generation HBM memory, also known as HBM3E, for AMD's upcoming AI accelerators.
On the same day, AMD announced its upcoming Instinct MI325X AI accelerator, which will feature a monster 288GB of HBM3E memory, which we now know will be supplied by Samsung. Back at ISSCC 2023, the world's largest semiconductor technology conference, held in San Francisco, California, Su announced that AMD was working with Samsung on developing HBM-PIM (processing-in-memory).
Su highlighted some of the benefits of PIM technology, such as reducing power consumption by an incredible 85% compared to conventional memory processing.
NVIDIA, AMD's competitor and the current AI GPU leader, is working closely with SK hynix on HBM3 and HBM3E memory, which it uses across its Hopper AI GPUs and new Blackwell AI GPUs. The fight for AI GPU supremacy is real, and securing the fastest HBM on the planet is key for every company involved; without it, there would be no AI GPUs of this magnitude right now. We'll keep an eye on this story as it progresses.