
LiquidStack says its new CDU can chill more than 1MW of AI compute


As GPUs and AI accelerators push beyond one kilowatt of power consumption, many systems builders are turning to liquid cooling to manage the heat. However, these systems still rely on complex networks of plumbing, manifolds, and coolant distribution units (CDUs) to make it all work.

On Thursday, LiquidStack, best known for its immersion cooling tech, launched a monster CDU with more than one megawatt of cooling capacity that it claims can plug into any existing platform using direct-to-chip liquid cooling.

CDUs are responsible for pumping coolant to and from connected systems and racks, exchanging the captured heat via a liquid-to-air or, in this case, liquid-to-liquid heat exchanger.
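To get a feel for what moving a megawatt-plus of heat through a coolant loop actually involves, here is a minimal back-of-the-envelope sketch based on the standard energy balance Q = ṁ·cp·ΔT. The coolant properties and the 10°C supply/return temperature difference are illustrative assumptions, not LiquidStack specifications.

```python
# Rough energy-balance sketch for a CDU-scale coolant loop.
# Assumptions (illustrative, not LiquidStack specs): water-like coolant
# with cp ~ 4186 J/(kg*K), density ~ 1000 kg/m^3, and a 10 C rise
# between supply and return.

HEAT_LOAD_W = 1_350_000      # 1.35 MW of heat to move
CP_J_PER_KG_K = 4186         # specific heat of water
DENSITY_KG_PER_M3 = 1000     # coolant density
DELTA_T_K = 10               # assumed supply/return temperature difference

# Q = m_dot * cp * dT  ->  m_dot = Q / (cp * dT)
mass_flow_kg_s = HEAT_LOAD_W / (CP_J_PER_KG_K * DELTA_T_K)
volume_flow_m3_s = mass_flow_kg_s / DENSITY_KG_PER_M3
volume_flow_lpm = volume_flow_m3_s * 1000 * 60  # litres per minute

print(f"Mass flow:   {mass_flow_kg_s:.1f} kg/s")
print(f"Volume flow: {volume_flow_lpm:.0f} L/min")
```

Under these assumptions the loop has to circulate on the order of 2,000 litres of coolant per minute, which is why CDUs at this scale are cabinet-sized machines rather than rack-mounted appliances.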

While a megawatt might seem like an obscene amount of cooling capacity, when it comes to modern AI systems it doesn't go as far as you might think. Next-gen systems from companies like Nvidia will cram somewhere between 5.4kW and 5.7kW of compute into a single rack unit and upwards of 120kW into a single rack.

At 1.35 megawatts, LiquidStack's first CDU has enough cooling capacity to handle an entire Nvidia DGX GB200 Superpod, with its 288 2,700W Grace-Blackwell Superchips, with room to spare. If you're curious about that system, you can find our deep dive here.
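The arithmetic behind the "room to spare" claim is straightforward, using only the figures quoted above. Note that the Superchip total excludes the rest of the Superpod's hardware (networking, storage, and so on), so the real headroom will be somewhat smaller than this sketch suggests.

```python
# Back-of-the-envelope check on the headroom claim, using figures
# from the article: 1.35 MW CDU capacity, 288 Grace-Blackwell
# Superchips at 2,700 W each, and racks of up to 120 kW.

CDU_CAPACITY_W = 1_350_000
superchip_load_w = 288 * 2_700          # 777,600 W for the Superchips alone
headroom_w = CDU_CAPACITY_W - superchip_load_w

racks_supported = CDU_CAPACITY_W / 120_000  # how many 120 kW racks fit under one CDU

print(f"Superchip load:       {superchip_load_w / 1000:.1f} kW")
print(f"Headroom:             {headroom_w / 1000:.1f} kW")
print(f"120 kW racks per CDU: {racks_supported:.1f}")
```

That works out to roughly 778kW for the Superchips against 1.35MW of capacity, or about eleven fully loaded 120kW racks per CDU.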
