Decoding Data Center Energy Consumption

Data centers power everyday life, and their energy use is rising fast. Scott Smith explains what drives data center energy consumption, why cooling matters and how smarter systems cut costs and risk.


When people ask what I do as the director of Mission Critical Offerings, the conversation often turns to data centers. Most know these facilities exist somewhere “out there,” quietly supporting everything from Spotify playlists to fraud detection in financial systems. Fewer realize how central they’ve become to daily life — or the scale of the energy and engineering required to keep them running.

Data center energy consumption accounts for over 4% of total U.S. power use, according to the U.S. Department of Energy. That number keeps climbing, and much of the money spent on powering data centers goes to cooling those high-powered computers. Understanding what drives data center energy consumption, and why cooling and reliability matter so much, starts with understanding the very nature of a data center. 

What is a data center? 

At its heart, a data center is a facility that houses hundreds, thousands or even tens of thousands of servers. Think of them as supercharged computers. These servers are responsible for “computational horsepower” — the ability to process complex transactions, store vast amounts of data, and keep the digital world spinning. When you upload a photo to the cloud, get a recommendation from an AI chatbot, or stream a movie online, your request is being processed somewhere inside a data center. 

Not long ago, most requests were relatively simple searches and transactions. Now, with the meteoric rise of AI and machine learning, the complexity of workloads and the demand for computational power have exploded. Much of this new digital intelligence depends on vast “computational horsepower,” and that horsepower generates heat. 

Cooling the Cloud: Innovation at the Heart of Data Centers

This episode features Scott Smith and Dr. Dereje Agonafer, Professor of Mechanical and Aerospace Engineering at the University of Texas at Arlington.

Learn more and watch the podcast: Healthy Spaces S5E9, Cooling the Cloud

The growing energy burden of data centers as digital infrastructure

Chips inside server racks can execute billions of calculations per second. That processing creates heat, and as chip densities and computational needs rise, so do the energy consumed and the heat produced. If this heat isn’t effectively managed, chips can overheat and shut down, threatening the reliability of digital services for millions of users. 

For small colocation providers with fewer backup options, the impact of a shutdown can be catastrophic, which is why, in our world, uptime and reliability are paramount. 

Why cooling and energy efficiency matter for high performance computing

At its core, cooling equipment pulls the heat generated by high-density chips away so that those chips operate at their designed levels. Traditionally, data centers relied on air cooling — fans blowing air over chips and moving that air out of the data hall. But as chip densities have increased in AI data centers, the need for cooling solutions that improve data center energy efficiency has grown, ushering in the age of liquid cooling and direct-to-chip solutions. 

Liquid cooling is more like a cold shower than an open window — much more effective at rapidly removing heat. By pressing a cold plate directly onto the chip and using water or specialized fluids to carry heat away, we’re able to keep more powerful processors running at their intended capacity. On the horizon, even more advanced “two-phase” cooling systems, which rely on refrigerants and phase change, are moving toward the mainstream as chip densities continue to rise. 
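
To make that difference concrete, here is a minimal back-of-envelope sketch in Python. Every number in it is an illustrative assumption (a 1 kW heat load and a 10 °C coolant temperature rise), not a figure from any specific chip or product; the point is simply that water’s much higher volumetric heat capacity lets it carry the same heat with a tiny fraction of the flow that air would need.

```python
# Back-of-envelope comparison: coolant flow required to remove a fixed heat
# load with air vs. water at the same temperature rise.
# All inputs are illustrative assumptions, not data from any specific product.

def flow_required_m3_per_s(heat_w, delta_t_k, density_kg_m3, cp_j_per_kg_k):
    """Volumetric flow (m^3/s) needed to absorb heat_w watts with a coolant
    temperature rise of delta_t_k kelvin: Q = rho * V_dot * c_p * dT."""
    return heat_w / (density_kg_m3 * cp_j_per_kg_k * delta_t_k)

HEAT_LOAD_W = 1_000   # assumed 1 kW of chip/module heat
DELTA_T_K = 10        # assumed 10 K coolant temperature rise

# Approximate fluid properties near room temperature
air_flow = flow_required_m3_per_s(HEAT_LOAD_W, DELTA_T_K, 1.2, 1005)
water_flow = flow_required_m3_per_s(HEAT_LOAD_W, DELTA_T_K, 997, 4180)

print(f"Air:   {air_flow * 1000:.1f} L/s")
print(f"Water: {water_flow * 1000:.3f} L/s")
print(f"Water needs roughly {air_flow / water_flow:,.0f}x less volumetric flow")
```

Run as written, the sketch shows water carrying the same kilowatt with a few thousandths of the flow, which is the physical reason direct-to-chip cold plates can handle densities that air simply cannot.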

Looking forward, immersion cooling and other novel approaches are being developed to meet the relentless growth in data center energy consumption. But the fundamental challenge remains the same: keeping digital workloads running smoothly, safely and efficiently. 

Beyond cooling efficiency

In addition to reducing data center energy consumption through improved controls and thermal energy storage, there’s momentum building around data center sustainability through waste heat recovery. The concept itself is simple (and not new): instead of simply wasting heat (which is also energy), data centers can reuse it for free cooling or share it with neighboring communities or industrial processes, feeding district heating systems and turning a liability into community value.
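
To get a feel for the scale involved, here is a rough, hypothetical sketch. Every input is an assumption chosen only for illustration (a 10 MW IT load, a 70% recovery fraction and a nominal 10 MWh of annual heating demand per home); real projects vary widely.

```python
# Rough sketch of the waste-heat-recovery idea.
# Every input below is an illustrative assumption, not data from a real site.

IT_LOAD_MW = 10          # assumed IT load of the data hall
HOURS_PER_YEAR = 8760
RECOVERY_FRACTION = 0.7  # assumed share of rejected heat actually captured

# Nearly all of the electrical power drawn by the servers is rejected as heat,
# so annual recoverable heat is roughly load x hours x recovery fraction.
recoverable_mwh = IT_LOAD_MW * HOURS_PER_YEAR * RECOVERY_FRACTION
print(f"Recoverable heat: ~{recoverable_mwh:,.0f} MWh(th) per year")

# Assumed annual space-heating demand for one home on a district network
HOME_HEAT_MWH_PER_YEAR = 10
homes = recoverable_mwh / HOME_HEAT_MWH_PER_YEAR
print(f"Comparable to the space-heating demand of ~{homes:,.0f} homes")
```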

As population density rises around data centers, and regulatory expectations tighten around the world, innovation in capturing and managing waste heat will be critical. 

What’s next in the near term for data centers

We can’t predict everything, but two things seem certain. First, the need for more reliable, scalable and sustainable data center cooling solutions is only going to accelerate as data center sizes and capacities continue to expand. We’re also seeing unprecedented growth not just in computing power, but in the equipment designed to support these massive operations. For example, where air-cooled chillers were once commonly sized at around two megawatts, we’ve recently announced a three megawatt (about 850 ton) air-cooled chiller — truly transformational in the industry. On the water-cooled side, chillers have grown to a staggering 18 to 20 megawatts in capacity. 
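
The “about 850 ton” figure is just the standard unit conversion: one ton of refrigeration is 12,000 BTU/h, or roughly 3.517 kW. A quick sketch of the arithmetic, using the capacities mentioned above:

```python
# Converting chiller capacity between megawatts and refrigeration tons.
# 1 ton of refrigeration = 12,000 BTU/h ≈ 3.517 kW (standard definition).

KW_PER_TON = 3.517

def mw_to_tons(megawatts: float) -> float:
    return megawatts * 1_000 / KW_PER_TON

for mw in (2, 3, 18, 20):
    print(f"{mw:>2} MW ≈ {mw_to_tons(mw):,.0f} tons of refrigeration")
# 3 MW works out to about 853 tons, i.e. the "about 850 ton" chiller above.
```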

And it’s not only chillers — CDUs (coolant distribution units) used for direct-to-chip liquid cooling are scaling up as well. The industry norm was previously one megawatt, but now solutions are arriving that can be scaled up to ten megawatts to match ever-growing data hall densities. 

Second, the boundary between data centers and the communities they serve will continue to blur in ways that support not just uptime, but also data center sustainability and societal benefit. As technology continues to advance, staying ahead means innovating in both cooling technology, and in how we think about the relationship between these critical facilities and the world around them. 

As the world becomes more digital and interconnected, data centers are becoming both engineering marvels and essential infrastructure. The challenge — and the opportunity — is to ensure innovative cooling and data center energy strategies keep pace with their explosive growth, delivering reliable, sustainable and valuable computing for all. 
