Liquid cooling technology isn’t new; it’s been around for over a century. But its application in data centers, especially with the rise of AI, is a recent trend that’s picking up speed. As demand for computing power grows, liquid cooling helps keep servers cool during heavy data processing.
When electricity flows through conductors, it generates heat. More power means more heat, and today’s AI workloads, which rely on expensive GPUs, require significant power. For instance, Nvidia’s GPUs have seen a sharp rise in energy consumption. The A100 used about 400 watts, while the latest H100 consumes 700 watts, and upcoming models are set to exceed 1,000 watts.
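To put those wattage figures in context, nearly every watt a GPU draws is dissipated as heat that the cooling system must remove. A rough sketch using the per-GPU figures cited above (the eight-GPU server chassis is an illustrative assumption, not a spec):

```python
# Rough heat-load estimate for a GPU server.
# Per-GPU wattages are the figures cited in the article; the
# eight-GPU chassis is an illustrative assumption.

def server_heat_load_kw(gpu_watts: float, gpus_per_server: int = 8) -> float:
    """Nearly all electrical power drawn ends up as heat to be removed."""
    return gpu_watts * gpus_per_server / 1000.0

for name, watts in [("A100", 400), ("H100", 700), ("next-gen", 1000)]:
    print(f"{name}: {server_heat_load_kw(watts):.1f} kW of heat per 8-GPU server")
```

At these densities a single server can shed several kilowatts, which is what pushes operators past the practical limits of air cooling.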
Laura Musgrave, a leading researcher in responsible AI at BJSS, highlights the urgent need to address this energy challenge. According to the International Energy Agency, data center energy demands could double by 2026, driven primarily by AI technologies. Musgrave points out that we need to rethink our approach to cooling systems for AI hardware.
Several liquid cooling approaches are in use in data centers. One is immersion cooling, which submerges hardware in a non-conductive fluid. Another uses rear-door heat exchangers, which pass coolant through the back of server racks. For high-power GPUs, however, the most effective method appears to be direct-to-chip cooling, in which liquid coolant flows through a cold plate mounted directly on the GPU, carrying heat away efficiently.
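Direct-to-chip cooling works because liquids carry far more heat per unit volume than air. A back-of-the-envelope sketch of the coolant flow needed to absorb one GPU's heat (water properties and the 10 °C coolant temperature rise are illustrative assumptions; real loops often use treated water or glycol mixtures):

```python
# Coolant flow needed to remove a heat load: Q = m_dot * c_p * dT.
# Water's properties and the 10 K temperature rise are illustrative
# assumptions, not figures from any vendor's system.

C_P_WATER = 4186.0   # J/(kg*K), specific heat of water
RHO_WATER = 1000.0   # kg/m^3, density of water

def flow_lpm(heat_watts: float, delta_t_k: float = 10.0) -> float:
    """Litres per minute of water to absorb heat_watts at a delta_t_k rise."""
    mass_flow = heat_watts / (C_P_WATER * delta_t_k)   # kg/s
    return mass_flow / RHO_WATER * 1000.0 * 60.0       # convert to L/min

print(f"700 W GPU: {flow_lpm(700):.2f} L/min")   # roughly 1 L/min
```

Under these assumptions, a modest trickle of water, on the order of a litre per minute, can carry away what an H100 dissipates, which is why a cold plate the size of the chip can outperform a wall of fans.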
The liquid cooling market, especially for AI systems, is evolving quickly. Vlad Galabov of Omdia notes that no single vendor dominates; the market is growing for everyone. Three years ago, only about 7% of data centers used liquid cooling; that figure has since jumped to around 22%, according to JetCool CEO Bernie Malouin.
There are now over a dozen significant players in the liquid cooling market, alongside startups. Yet Joe Capes from LiquidStack argues that current offerings still don’t meet market demand. He describes this as a historic moment in the industry, with changes happening almost weekly.
New companies are also emerging with innovative cooling solutions. For example, JetCool employs microconvective cooling, using small fluid jets to target hot spots, while LiquidStack offers both immersion and direct-to-chip options. Major manufacturers such as Dell and HPE are also developing customized liquid cooling solutions for specific needs.
While liquid cooling can help alleviate some power consumption issues, concerns about overall energy use remain. Capes warns that predictions show data center energy consumption could triple in the next decade.
Amid all this, some companies are focusing on preventing heat generation altogether. Daanaa Resolution is testing a power transaction unit designed to optimize energy transfer and reduce waste, avoiding the inefficiencies of traditional power conversion. CEO Udi Daon emphasizes that his technology isn’t a competitor to liquid cooling but rather part of a broader push for more efficient data centers.
Today’s focus in the liquid cooling sector is less about sustainability and more about enhancing AI performance and accommodating more GPUs. At a recent AI conference, former Google CEO Eric Schmidt expressed skepticism about achieving climate goals, framing AI as a potential problem-solver rather than a limitation.
Analysts like Steven Dickens point out that data centers had been growing steadily more efficient, but the widespread adoption of GPUs has reversed that trend. To get back on track, he argues, will require substantial investment in technologies like liquid cooling.
There’s also a call for better consumer education around energy consumption. A proposed AI energy rating system could provide insights into efficiency, helping users make informed decisions about AI tools.
In the end, addressing energy consumption is crucial. Companies and consumers alike must prioritize their energy use, particularly in regions facing limited power availability and tough choices about resource allocation.