Micron teases tall form factor 256GB DDR5-8800 memory sticks for next-gen servers

Micron teased some incredible new 256GB MCR DIMM memory modules at NVIDIA’s recent GPU Technology Conference (GTC) last week.



The new Micron 256GB DDR5-8800 MCRDIMMs are designed for next-generation servers, and given the GTC 2024 setting, they are clearly aimed at AI systems and servers, including those built on Intel's upcoming Xeon Scalable "Granite Rapids" processors. Micron is already sampling the new 256GB modules to customers.

Micron's new 256GB MCRDIMM in DDR5-8800 spec, as pictured by Tom's Hardware at GTC, was a tall form factor module, but the company also plans to offer its new MCRDIMMs in standard height for 1U servers and more. Micron is using monolithic 32Gb DDR5 ICs for the tall module, while the standard-height version uses 2-Hi stacked packages, meaning it will run a little hotter because there's less space for thermal dissipation.

The tall Micron module in 256GB form draws around 20W of power, twice what Micron's 128GB DDR5-8000 RDIMM memory module consumes when running at DDR5-4800 speeds (around 10W).

MCR stands for Multiplexer Combined Ranks: a style of dual-rank memory module in which a specialized buffer allows both ranks to operate concurrently. The buffer lets the two physical ranks behave like two separate modules working together in parallel, enabling the simultaneous retrieval of 128 bytes of data from both ranks per clock and doubling the performance of a single module.

The buffer also operates with the host memory controller on the DDR5 protocol, at speeds beyond the official standard, which is DDR5-8800 in the case of Micron's new 256GB module. Impressive stuff. Tom's Hardware adds: "Typically, modules with two physical ranks function as a single module, meaning that when the host CPU (or memory controller) retrieves data from such a module, it is limited to fetching 64 bytes of data at a time. MCRDIMMs double that, thus substantially increasing per-module capacity and performance."
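The fetch-size doubling described above can be sketched in a few lines of Python. This is purely illustrative, using the article's numbers (a 64-byte cache-line fetch per rank, two ranks); the function names are hypothetical, not anything from a real memory-controller API.

```python
# Per-clock data fetched from a conventional dual-rank module vs. an MCRDIMM.
# A standard module's two ranks act as one logical module, returning a single
# 64-byte cache line per fetch; the MCR buffer reads from both ranks in parallel.

CACHE_LINE_BYTES = 64

def rdimm_fetch_per_clock() -> int:
    # Both physical ranks present as one logical module: one 64-byte transfer.
    return CACHE_LINE_BYTES

def mcrdimm_fetch_per_clock(ranks: int = 2) -> int:
    # The multiplexing buffer lets each rank contribute a cache line
    # in the same interval, so the fetch size scales with the rank count.
    return CACHE_LINE_BYTES * ranks

print(rdimm_fetch_per_clock())    # 64 bytes per fetch
print(mcrdimm_fetch_per_clock())  # 128 bytes per fetch
```

With the default two ranks, the MCRDIMM sketch returns exactly twice the bytes per clock of the conventional module, matching the doubling Tom's Hardware describes.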

Micron's new 256GB MCRDIMMs should be found inside future AI servers based on Intel's new Xeon Scalable "Granite Rapids" processors, as these types of machines feature oodles of memory for training. With 256GB DIMMs in a 12-channel system that supports two modules per channel, a Granite Rapids-powered AI server could feature 3TB of DDR5 memory using 12 slots, or a huge 6TB of DDR5 memory using all 24 slots.