New and innovative AI technologies have led to an industry-wide race in accelerated computing and AI infrastructure buildouts. Cloud service providers, enterprise customers, and sovereign nations are deploying AI data centers at a rapid rate, but the high-performance GPUs that drive these data centers consume tremendous amounts of power, which can constrain deployment timelines and output. According to a report by Goldman Sachs, AI data centers are projected to consume 8% of U.S. power by 2030, up from just 3% in 2022.
Once geography, utility constraints, and power demand are factored in, deploying sustainable AI infrastructure becomes exceedingly difficult for organizations to achieve. Direct liquid cooling (DLC) is critical to ensuring smooth, sustainable, and efficient data center operations that optimize performance without draining the power grid.
Why Direct Liquid Cooling?
As demand for high-performance AI accelerates, CPUs and GPUs generate significant heat, threatening data center reliability and IT operations. A data center must stay cool to run effectively.
Until now, air cooling, which surrounds the hot CPUs with cool air, was the answer. Air cooling depends on the temperature and volume of the inlet air passed over the hot chips, and with so many servers packed into a data center, it can be difficult for cool air to circulate properly. Air cooling also requires computer room air handlers and server fans to run constantly, a large contributor to the already significant amount of energy AI data centers consume.
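To see why liquid is so much better at carrying heat away than air, consider a back-of-the-envelope comparison of volumetric heat capacity, i.e., how much heat a cubic meter of coolant absorbs per degree of temperature rise. The property values below are standard textbook figures, not numbers from this article:

```python
# Back-of-the-envelope: heat absorbed per cubic meter per degree of
# temperature rise, for air vs. water (standard textbook properties).
AIR_DENSITY = 1.2        # kg/m^3, air at roughly room temperature
AIR_CP = 1005            # J/(kg*K), specific heat of air
WATER_DENSITY = 1000     # kg/m^3
WATER_CP = 4186          # J/(kg*K), specific heat of water

air_vol_heat = AIR_DENSITY * AIR_CP        # J/(m^3*K)
water_vol_heat = WATER_DENSITY * WATER_CP  # J/(m^3*K)

print(f"Air:   {air_vol_heat:,.0f} J per m^3 per K")
print(f"Water: {water_vol_heat:,.0f} J per m^3 per K")
print(f"Ratio: ~{water_vol_heat / air_vol_heat:,.0f}x")
```

By this rough measure, a given volume of water carries on the order of a few thousand times more heat than the same volume of air, which is why a small coolant loop can replace enormous volumes of moving air.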
DLC, on the other hand, is significantly better than air at removing heat and has a multitude of operational and environmental benefits that make it the superior option for cooling a data center.
Benefits of Direct Liquid Cooling
DLC is well on its way to becoming the default cooling option for data centers. According to Global Market Insights, sales of liquid cooling IT equipment surpassed $2 billion in 2022, with forecasts pointing to 15% annual growth, reaching $12 billion by 2032.
While there are multiple benefits of DLC, we’ll focus on the top three:
1) Improved Sustainability
The most appealing attribute of DLC is its sustainability. In an era when organizations strive to meet green computing goals and ESG compliance, increasing AI productivity while reducing emissions can seem contradictory. Switching to DLC has been shown to help reduce Scope 2 and Scope 3 carbon emissions thanks to its energy efficiency and low power consumption. And because DLC draws less electrical power from the grid, it can shorten organizations' AI factory deployment times, whereas securing significant new power from the utility can take years.
2) Increased Cost Savings
Among the biggest factors to consider when evaluating cooling options is cost savings. Because DLC eliminates most fan power and reduces power usage effectiveness (PUE), savings can amount to $60,067 per rack over a three-year period. DLC can also cut an organization's electric utility demand by up to 40%, which is pivotal in geographic areas where power is limited.
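The mechanics behind PUE-driven savings can be sketched with a simple calculation. PUE is total facility power divided by IT power, so lowering it directly cuts the overhead energy spent on cooling. Every input below (rack load, PUE values, electricity rate) is an illustrative assumption, not a figure from this article or any vendor:

```python
# Hypothetical sketch: annual energy and cost savings from a lower PUE.
# All inputs are illustrative assumptions chosen for round numbers.
IT_LOAD_KW = 40          # assumed IT load of one high-density rack
PUE_AIR = 1.5            # assumed PUE with traditional air cooling
PUE_DLC = 1.1            # assumed PUE with direct liquid cooling
PRICE_PER_KWH = 0.10     # assumed utility rate, USD per kWh
HOURS_PER_YEAR = 8760

def annual_facility_kwh(it_load_kw: float, pue: float) -> float:
    """Total facility energy = IT energy * PUE."""
    return it_load_kw * pue * HOURS_PER_YEAR

saved_kwh = (annual_facility_kwh(IT_LOAD_KW, PUE_AIR)
             - annual_facility_kwh(IT_LOAD_KW, PUE_DLC))
print(f"Energy saved: {saved_kwh:,.0f} kWh/year")
print(f"Cost saved:   ${saved_kwh * PRICE_PER_KWH:,.0f}/year")
```

With these assumed numbers, dropping PUE from 1.5 to 1.1 saves roughly 140,000 kWh per rack per year; the actual figure depends heavily on rack density, climate, and local utility rates.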
3) Data Center Efficiency
Direct liquid cooling has numerous operational benefits.
For example, when a data center is air-cooled, it's common for CPUs and GPUs to throttle back performance when reaching or getting close to their maximum operating temperature. While thermal throttling does prevent chip damage, it also reduces the performance of the data center, leading to lower application throughput. DLC allows chips to run at full performance without throttling, mitigating the performance loss otherwise seen with air cooling.
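The throttling behavior described above can be illustrated with a toy model: below a threshold temperature the chip runs at full clock, and above it the clock is derated until an emergency shutdown point. The thresholds and clock speed here are hypothetical; real GPUs use vendor-specific stepped or PID-style controls:

```python
# Toy model of thermal throttling (illustrative values only).
T_THROTTLE = 85.0    # assumed throttle threshold, deg C
T_SHUTDOWN = 100.0   # assumed emergency shutdown limit, deg C
BASE_CLOCK_GHZ = 2.0 # assumed full-performance clock

def effective_clock(temp_c: float) -> float:
    """Clock speed the chip sustains at a given die temperature."""
    if temp_c < T_THROTTLE:
        return BASE_CLOCK_GHZ            # full performance, no throttling
    if temp_c >= T_SHUTDOWN:
        return 0.0                       # thermal shutdown
    # Linear derate between the throttle threshold and the shutdown limit.
    frac = (T_SHUTDOWN - temp_c) / (T_SHUTDOWN - T_THROTTLE)
    return BASE_CLOCK_GHZ * frac

for t in (70.0, 90.0, 99.0):
    print(f"{t:>5.1f} C -> {effective_clock(t):.2f} GHz")
```

In this model, liquid cooling's benefit is simply keeping the die below `T_THROTTLE`, so the chip never leaves the full-clock branch.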
Unlock the near future with Direct Liquid Cooling
Direct liquid cooling is no longer a distant, abstract technology: It's here. As AI progresses and data centers embrace this technology, data center providers need to evolve along with it. The good news is that this evolution can now happen sustainably and efficiently through DLC, ensuring that the future of computing can continue without the hindrance of yesterday's technology.