AI startup TensorWave is one of the first to publicly deploy a setup powered by AMD Instinct MI300X AI accelerators, and its CEO says they’re a far better option than NVIDIA’s dominant Hopper H100 AI GPU.
TensorWave has started racking up AI systems powered by AMD’s just-released Instinct MI300X AI accelerator, and plans to lease the chips out at a fraction of the cost of NVIDIA’s Hopper H100 AI GPU. The company expects to have 20,000 of AMD’s new Instinct MI300X AI accelerators deployed across two facilities before the end of the year, with liquid-cooled systems coming online in 2025.
TensorWave’s Jeff Tatarchuk said that in “just raw specs, the MI300X dominates H100,” and he’s not wrong. But that comparison is against the original H100 and its 80GB of HBM3; NVIDIA’s beefed-up H200 counters with a much larger 141GB of ultra-fast HBM3e and up to 4.8TB/sec of memory bandwidth, and once Blackwell B200 is here, NVIDIA owns the AI GPU memory game with 192GB of HBM3e at 8TB/sec for the B200 AI GPU.
- AMD Instinct MI300X: 192GB HBM3 @ up to 5.3TB/sec
- NVIDIA H100: 80GB HBM3 @ up to 3.35TB/sec
- NVIDIA H200: 141GB HBM3e @ up to 4.8TB/sec
- NVIDIA B200: 192GB HBM3e @ up to 8TB/sec
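To put those numbers in perspective, here’s a minimal back-of-the-envelope sketch (ours, not TensorWave’s or AMD’s) of what the bandwidth figures imply for memory-bound work like single-batch LLM token generation, where every byte of weights has to be streamed from HBM for each token. It assumes the peak figures listed above, which real kernels never fully reach:

```python
# Rough roofline-style comparison using the peak specs listed above.
# For a memory-bound workload, the time to stream the card's entire
# HBM once (capacity / bandwidth) is a lower bound on time per token
# for a model that fills the card. Peak figures only; real-world
# kernels achieve a fraction of these numbers.

GPUS = {
    # name: (HBM capacity in GB, peak bandwidth in TB/s)
    "AMD Instinct MI300X": (192, 5.3),
    "NVIDIA H100": (80, 3.35),
    "NVIDIA H200": (141, 4.8),
    "NVIDIA B200": (192, 8.0),
}

for name, (mem_gb, bw_tbs) in GPUS.items():
    # GB divided by TB/s conveniently comes out in milliseconds.
    full_sweep_ms = mem_gb / bw_tbs
    print(f"{name:20s} full-HBM sweep: ~{full_sweep_ms:4.1f} ms")
```

Note the tradeoff this crude measure exposes: more capacity means a bigger model fits on one card, but also more bytes to stream per token, which is why capacity and bandwidth both matter.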
AMD has the most VRAM on an AI GPU so far, with NVIDIA lagging behind: the H100 tops out at 80GB (unless you’re in China, with access to 96GB H100 models), and even the upcoming H200 with 141GB of HBM3e doesn’t close the gap. Yes, the H200 uses HBM3e and offers more memory bandwidth than the H100, but it still carries less memory, and moves it more slowly, than AMD’s Instinct MI300X.
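As a concrete illustration of why that capacity lead matters, here’s another rough sketch, assuming fp16 weights and ignoring the KV cache, activations, and parallelism overhead (the model sizes are illustrative), of how many of each GPU you’d need just to hold a model’s weights:

```python
import math

# Hypothetical sizing helper: how many GPUs are needed just to hold a
# model's weights at fp16 (2 bytes per parameter)? This ignores the KV
# cache, activations, and parallelism overhead, so real deployments
# need headroom beyond these counts.

CAPACITY_GB = {"MI300X": 192, "H100": 80, "H200": 141}

def gpus_needed(params_billions: float, mem_gb: float) -> int:
    weights_gb = params_billions * 2  # fp16: 2 bytes per parameter
    return math.ceil(weights_gb / mem_gb)

for size_b in (70, 175):  # illustrative model sizes, in billions of parameters
    counts = {gpu: gpus_needed(size_b, gb) for gpu, gb in CAPACITY_GB.items()}
    print(f"{size_b}B-parameter model -> {counts}")
```

Under those assumptions, a 70B-parameter model fits on a single MI300X but needs two H100s, and a 175B-parameter model needs two MI300Xs versus five H100s.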
But it’s not just about raw hardware and VRAM for AI workloads; the accelerators also need to deliver the same real-world performance as NVIDIA’s dominant H100 AI GPUs. Tatarchuk says there’s a lot of enthusiasm for AMD’s new Instinct MI300X AI accelerators as an alternative to NVIDIA, but customers aren’t sure they’ll get the same performance.
He said: “there’s also a lot of ‘we’re not 100% sure if it’s going to be as great as what we’re currently used to on NVIDIA,’” which is true.