Data Center Power Needs Will Continue to Increase

Data centers have been facing a conundrum for the past several years. On one hand, they must supply services to a rapidly increasing number of consumers, whose technology becomes exponentially more complex and demanding. On the other hand, because data centers consume such a massive amount of power, they have been tasked with reducing their energy usage to ease the demand on power distribution centers. 

Given the choice between lowering computing capacity and improving energy efficiency, data centers obviously prefer the latter. But that plan comes with several challenges. 

It’s a problem that has not yet been solved, and, according to most predictions, won’t be solved anytime soon. Let’s take a look at why data center power consumption will probably increase rather than decrease in the near future. 

Complex technologies are surging in popularity.

Advanced technologies such as AI, IoT, 5G and cryptocurrency mining are becoming more commonplace. Previously, only large corporations ran expansive IoT networks, and only specialized Bitcoin miners used machines capable of performing the enormous number of hash computations that mining requires. But now anyone who wants to try these technologies can find a convenient entry point. 

Small businesses now have IoT networks, crypto miners set up powerful mining rigs in their own homes, and most major cities have 5G networks freely available. Not to mention, AI has become the backbone of digital life for many companies and individuals, enabling things such as intelligent document processing, machine learning, computer vision, and other technologies that require more “brain power” than a typical computer program can offer. 

New digital connections are occurring around the world.

While many countries already have a strong digital presence, new and emerging digital economies are constantly forming and expanding as companies in those locations establish digital transactions and other types of online interactions. Some may move their employee information to the cloud, set up an IoT network for the first time, or build software that requires cloud processing. 

With every new connection comes more demand for storage and power, increasing the load on data centers. 

Companies just can’t get enough of harvesting and analyzing data.

Information is powerful, and no company wants to be left behind. Companies today harvest data from their customers, equipment, employees, and even their own business processes. The belief is that everything has room for optimization and improvement if only you can collect enough information to tell you what’s really going on behind the scenes. 

The scale of this big data trend is a significant contributing factor to the power demands of data center operations. 

Consider this: if you take every single piece of data in the world and lump it all together—an incomprehensible amount of data—a full 50 percent of that data was created sometime within the past two years. The rate of expansion is staggering.

And every single data point requires several processes that all draw power. First, the data point must be collected. This can be done with a device, such as an IoT sensor taking temperature readings. A customer filling out a feedback survey or an employee’s computer logging active work time are a couple of other common examples. 

Once the data has been collected, it must be stored. Many companies use storage servers inside data centers as the vaults for the innumerable data points that flood their systems every second of every day. 

Because data sitting untouched on a drive isn’t very helpful, the data must then be analyzed to see what kind of context it can bring to any given situation. The problem is that the vast amount of data stored by most companies simply cannot be interpreted by human beings. No person or group of people would ever be able to read every data point, let alone make sense of them all. So, most data analysis involves some form of AI, which has large power demands of its own. 
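The collect–store–analyze cycle described above can be sketched as a toy pipeline. All of the names here are hypothetical, and an in-memory SQLite database stands in for a data center storage server; real deployments would use message queues, distributed storage, and far heavier analysis:

```python
import random
import sqlite3
import statistics

# 1. Collect: simulate an IoT temperature sensor taking 100 readings.
readings = [("sensor-1", round(random.uniform(18.0, 32.0), 1)) for _ in range(100)]

# 2. Store: persist the readings (SQLite stands in for a storage server).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (sensor TEXT, temp_c REAL)")
db.executemany("INSERT INTO readings VALUES (?, ?)", readings)

# 3. Analyze: far too many points to eyeball, so compute summary statistics
#    (a stand-in for the AI-driven analysis described above).
temps = [t for (_, t) in readings]
mean_t = statistics.mean(temps)
spread = statistics.stdev(temps)
outliers = [t for t in temps if abs(t - mean_t) > 2 * spread]
print(f"mean={mean_t:.1f} C, outliers={len(outliers)}")
```

Every one of these steps draws power, and at data center scale the pipeline runs continuously across billions of data points.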

As data centers support most or all of the steps in this process, it’s the data center’s energy source—rather than the customer’s local grid—taking the brunt of the impact. 

Massive workloads require lots of power—and cooling. 

The more power a computer draws, the hotter it runs. Electricity is energy, after all, and virtually all of the power a server consumes is ultimately converted to heat as it crunches numbers and runs demanding AI workloads. 
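A rough worked example makes the relationship concrete. The rack size and PUE figure below are illustrative assumptions, not measurements from any particular facility:

```python
# Illustrative numbers, not measurements from any real facility.
it_load_kw = 30.0   # assumed power draw of one high-density rack
pue = 1.5           # assumed power usage effectiveness (total power / IT power)

# Essentially all IT power ends up as heat the cooling plant must reject.
heat_load_kw = it_load_kw

# Overhead (mostly cooling) is whatever total facility power exceeds IT power.
total_kw = it_load_kw * pue
overhead_kw = total_kw - it_load_kw

print(f"heat to remove: {heat_load_kw:.0f} kW")
print(f"cooling & other overhead: {overhead_kw:.0f} kW of a {total_kw:.0f} kW total")
```

Under these assumptions, a single 30 kW rack forces the facility to spend another 15 kW just on cooling and other overhead, which is why cooling efficiency matters as much as compute efficiency.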

If you operate a data center, you know this all too well. Do you ever feel as if you’re constantly playing a game of catch-up with your cooling infrastructure? As soon as you install new data center cooling systems to handle the immense heat generated by all of that powerful hardware, the hardware gets more powerful and you have to cram more of it into a smaller footprint within your facility. 

Liquid cooling is currently the most energy-efficient option for many data centers. This includes higher-end approaches such as immersion cooling and liquid-cooled heat sinks on individual chips that run particularly hot. If you’re still relying primarily on an air-cooling system and you want to improve efficiency within your own facility, making the switch to liquid should be a top priority. 

Computing efficiency is increasing. Just not fast enough to keep up.

Data centers can take steps to increase computing and energy efficiency. Strategies such as switching to more energy-efficient hardware and infrastructure have so far kept the growth in energy usage far less steep than the growth in workload would have produced had no effort been made to reduce power consumption. 

It’s not that efforts toward energy efficiency aren’t working; they are very effective. The issue is that the combined challenges exceed our current ability to mitigate growing power demands. 

What can you do as a data center operator?

The good news is that there are a few promising strategies already in place or on the horizon. While these concepts may not be full solutions—they won’t improve energy efficiency enough to reduce total power usage—they help data centers handle increasing workloads without proportionally raising power demands. These strategies can help you flatten the curve: 

  • Using edge computing to handle some of the problem-solving done by AI and machine learning
  • Upgrading to more efficient cooling systems, such as switching from air to liquid cooling or installing immersion coolers in your facility
  • Staying up-to-date on the most energy efficient hardware, especially when it comes to CPUs and optical interconnects
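The first bullet can be illustrated with a toy sketch: pre-aggregating sensor data at the edge so the data center receives compact summaries instead of every raw point. The function name and window size are hypothetical:

```python
from statistics import mean

def edge_aggregate(raw_readings, window=60):
    """Summarize each window of raw readings locally, sending only
    (min, mean, max) per window instead of every individual point."""
    summaries = []
    for i in range(0, len(raw_readings), window):
        chunk = raw_readings[i:i + window]
        summaries.append((min(chunk), mean(chunk), max(chunk)))
    return summaries

# 600 simulated temperature readings cycling between 20.0 and 23.0.
raw = [20 + (i % 7) * 0.5 for i in range(600)]
summaries = edge_aggregate(raw)
print(f"sent {len(summaries)} summaries instead of {len(raw)} raw points")
```

In this sketch the edge device ships 60x less data upstream, shifting a slice of the collection and storage load off the data center.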

If all else fails, you could always follow Meta’s and Microsoft’s lead and pick a new data center location in a cooler climate to offset your cooling needs. You probably don’t want to go to the lengths that Microsoft did, lowering hermetically sealed servers into the ocean for a continuous supply of liquid cooling, but it is feasible to choose a location where you have a steady power source nearby and the grid is not already overtaxed. 

Despite best efforts, projections still have data center power usage increasing by somewhere between two and seven times by 2030. In light of the global push away from fossil fuels and toward renewable “green” energy, it’s understandable that many data center operators aren’t quite sure how to proceed. Some data centers are moving toward locations where hydroelectric or other renewable energy sources can power their facilities, at least in part. 
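To put that projection in perspective, a quick calculation converts the multipliers into annual growth rates. The roughly seven-year horizon to 2030 is an assumption:

```python
years = 7  # assumed horizon to 2030

for multiplier in (2, 7):
    # Compound annual growth rate implied by the overall multiplier.
    annual_growth = multiplier ** (1 / years) - 1
    print(f"{multiplier}x over {years} years ~ {annual_growth:.0%} growth per year")
```

Even the low end of the range implies roughly 10 percent compounding growth in power usage every year; the high end implies over 30 percent.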

Globally, data centers still use less than 10 percent of the world’s power. But the rate at which data centers are growing means that share of the total power will grow as well. The weight doesn’t just fall on data centers to lessen their energy impact; semiconductor developers are also working to develop more energy-efficient chips. It’s unclear how much the global semiconductor shortage may be hampering our ability to use the newest chipsets to save energy, but it’s almost certainly playing a part in the struggle. 

Until conditions change or some novel innovation brings new solutions to the table—we’ve heard good things about photonic computing as a relatively heatless solution, but mass adoption seems far in the future—these strategies might be the best most of us can do.
