How Data Center Energy Efficiency Promotes Savings
Energy costs contribute significantly to the overall cost of running a data center, so it stands to reason that a poor understanding of data center energy efficiency, or the lack of an effective plan, can lead to unnecessary costs and wasted resources.
Taking steps to improve energy efficiency in data centers is important not just from a financial standpoint, but from an operations perspective as well. After all, a plan built around energy efficiency can result in:
- Optimized spend
- Resource savings
- Better business continuity management and reliability
Below, we’ll cover what data center energy efficiency is, why it’s important, and ways data centers can improve their energy efficiency.
What is Data Center Energy Efficiency?
The energy demand stemming from data centers is large and rising. According to the U.S. Department of Energy, about 2% of total U.S. electricity consumption comes from data centers, and that share can be expected to grow with the increasing popularity of and need for high-performance computing, unless energy efficiency measures keep pace.
Server workloads and demand vary by use case, but in general, a server will use between 1,800 and 1,900 kilowatt-hours (kWh) per year. Data centers as a whole consume approximately 1,000 kWh per square meter per year. To put these numbers into perspective, roughly ten square meters of data center floor space consume as much energy in a year as an average American home.
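As a rough illustration, the per-server figure above translates directly into an annual cost estimate. This is a minimal sketch: the 1,850 kWh value is the midpoint of the range cited above, and the $0.12/kWh electricity rate is a hypothetical assumption, not a figure from this article.

```python
# Rough annual energy-cost estimate for a server fleet.
# 1,850 kWh/server/year is the midpoint of the 1,800-1,900 kWh range
# cited above; the $0.12/kWh rate is a hypothetical assumption.
KWH_PER_SERVER_PER_YEAR = 1_850
RATE_USD_PER_KWH = 0.12

def annual_server_energy_cost(num_servers: int) -> float:
    """Estimated yearly electricity cost for a fleet of servers."""
    return num_servers * KWH_PER_SERVER_PER_YEAR * RATE_USD_PER_KWH

print(f"100 servers: ${annual_server_energy_cost(100):,.0f}/year")
```

At these assumed rates, a 100-server fleet runs to roughly $22,000 a year in electricity before cooling is even counted, which is why the efficiency measures below focus on servers first.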
Data center efficiency is all about taking these numbers and bringing them down wherever possible, without sacrificing performance or threatening the safety and reliability of the IT equipment inside the data center.
What Consumes the Most Energy in a Data Center?
Servers and cooling systems are the largest culprits for energy consumption in a data center. Servers tend to take the lead, with cooling systems coming in at a close second. The rest of the energy use comes from storage devices, power distribution systems, and networking equipment.
To process data, servers need to run powerful processors and other components. Data center cooling systems must move air and water around to prevent servers from overheating. These intense tasks place a huge demand on energy resources.
Why is Data Center Energy Efficiency Important?
As previously mentioned, data centers are large consumers of energy. Any measures that organizations can take to make data centers more energy-efficient can reduce the global impact and save on the costs of running the facility.
Energy efficiency measures can also make data centers more reliable. Power outages and other supply disruptions can significantly impact data center operations, and a data center with optimized power usage is more resilient to them.
The important thing to remember is that you don't have to take on every energy efficiency project at the same time. Incremental moves to boost data center energy efficiency add up: consolidating even one server can save up to $2,500 in annual costs, including $500 per year in energy costs, and managing airflow in a large facility can reduce the overall energy bill by $360,000. Even the smallest changes make cumulative, notable differences.
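The cumulative effect of incremental projects can be sketched with simple arithmetic. The project mix below is hypothetical; the per-project figures are the estimates cited above.

```python
# Cumulative annual savings from incremental efficiency projects.
# The project mix is hypothetical; the per-project figures come from
# the estimates above (up to $2,500/year per consolidated server, and
# up to $360,000/year from airflow management in a large facility).
projects_usd_per_year = {
    "consolidate 4 servers": 4 * 2_500,
    "airflow management": 360_000,
}
total = sum(projects_usd_per_year.values())
print(f"Estimated annual savings: ${total:,}")
```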
Estimating Data Center Energy Consumption
There are two main ways to estimate the energy consumption at a data center: bottom-up models and extrapolation models.
A bottom-up model is a more accurate, but time-consuming, method to estimate energy consumption. To do this, you need to add up the energy consumed by each piece of equipment in the data center. This can be more data-intensive and be more work for the facility.
An easier way to make this estimation is by using extrapolation-based models, which use historical data on energy consumption to predict future trends. For organizations looking for an easy entry point to understanding what their consumption looks like, an extrapolation-based model is a good way to go, as it takes less time to implement and requires less data. Once historical data is collected, businesses can use regression analysis to predict future energy demands.
Tips to Improve Data Center Energy Efficiency
Data center operators can improve the efficiency, and therefore overall costs, of their facilities by making one or more of the following adjustments.
Experiment with Temperature
According to AKCP, every 1-degree Fahrenheit increase in server inlet temperature (the temperature of the air entering the server) can bring down energy costs by 4-5%. A larger adjustment, such as raising the set point from 64.4°F to 75.2°F, can cut operating costs by as much as 43%. Cooling accounts for a significant portion of data center energy consumption, so every degree makes a difference.
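A back-of-the-envelope sketch shows how per-degree savings accumulate. This assumes the midpoint of the 4-5% figure above and that savings compound multiplicatively per degree, which is an assumption; real facilities vary, though the result lands in the same range as the 43% figure cited above.

```python
# Compounded cooling-cost reduction from raising the server inlet
# temperature. Assumes ~4.5% savings per 1°F (midpoint of the 4-5%
# figure above), compounding per degree -- a simplifying assumption.
SAVINGS_PER_DEG_F = 0.045

def remaining_cost_fraction(degrees_raised: float) -> float:
    """Fraction of the original cooling cost left after raising the
    set point by `degrees_raised` degrees Fahrenheit."""
    return (1 - SAVINGS_PER_DEG_F) ** degrees_raised

# Raising the set point from 64.4°F to 75.2°F is a 10.8°F change:
saved = 1 - remaining_cost_fraction(75.2 - 64.4)
print(f"Estimated cooling-cost reduction: {saved:.0%}")
```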
Servers can also be arranged in a hot aisle/cold aisle layout to improve cooling efficiency. In this layout, the hot air exhausted by the servers is contained in a separate aisle, away from the cold air being used to cool them. Hot aisle/cold aisle layouts can reduce fan energy use by 20 to 25%.
Enclose or Contain Your Server Racks
One way you can separate your server racks is by using a hot aisle/cold aisle layout, but other options are available. Data centers that enclose or contain their server racks use physical barriers to keep the hot air exhaust away from cold air intake. Some methods that might be used include solid or perforated rack doors, overhead shrouds, or aisle containment panels.
Use Variable-Speed Fans
Instead of running fans at the same speed all the time, variable-speed fans adjust to the cooling needs of the data center. Matching cooling output to demand more accurately lowers energy costs during idle or low-demand periods.
Virtualize Your Servers
With virtualization, multiple virtual machines (VMs) can run on a single physical server. Reducing the number of physical servers in the data center saves energy, but you'll also want to avoid:
- Over-provisioning: Putting too many VMs on one server
- Sprawl: Uncontrolled creation and proliferation of VMs
- Lapses in monitoring: Failing to keep an eye on the performance and consumption of virtualized servers
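To avoid over-provisioning during consolidation, a quick capacity check helps before placing VMs on hosts. The sketch below uses a first-fit-decreasing packing heuristic with hypothetical core counts and a 70% utilization cap; it is a rough planning aid, not a production sizing tool.

```python
# Rough capacity check for VM consolidation: how many hosts are
# needed so no host exceeds a utilization cap? Uses a first-fit-
# decreasing heuristic. Core counts and the 70% cap are hypothetical.
MAX_CPU_UTIL = 0.70  # headroom target per host, to avoid over-provisioning

def hosts_needed(vm_cpu_demands: list[float], host_capacity: float) -> int:
    """Minimum hosts (per first-fit-decreasing) so each host stays
    under the utilization cap."""
    usable = host_capacity * MAX_CPU_UTIL
    hosts: list[float] = []  # current CPU load placed on each host
    for demand in sorted(vm_cpu_demands, reverse=True):
        for i, used in enumerate(hosts):
            if used + demand <= usable:
                hosts[i] += demand
                break
        else:  # no existing host has room; add a new one
            hosts.append(demand)
    return len(hosts)

# Six VMs (CPU cores of demand) packed onto 16-core hosts:
print(hosts_needed([8, 6, 6, 4, 4, 2], 16))
```

In this example the six workloads fit on three hosts with headroom to spare, so consolidating below that would push hosts past the cap.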
Monitor Energy Consumption
Regularly monitoring the energy consumption of a data center makes it much easier to recognize unexpected increases in cost. It can also help facility managers identify opportunities for energy savings.
Data Center Environment Monitoring
In addition to keeping tabs on energy consumption, data centers should monitor environmental metrics. Temperature, airflow, and humidity data can be used to confirm the data center is operating as expected and not consuming more resources than needed.
Some data center managers may choose to use a specific metric that measures data center power efficiency, known as power usage effectiveness (PUE). It is calculated by dividing the total energy consumption of a data center by the amount of power the IT equipment consumes. What counts as a "good" PUE depends on what is housed in the data center, but the ideal PUE would be 1.0, where 100% of the energy consumed goes to IT equipment. While that isn't realistic, a PUE under 1.5 is generally considered good, and one over 1.5 suggests room for improvement.
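The PUE calculation described above is a one-line ratio. The facility figures below are invented example numbers, and the 1.5 threshold is the rule of thumb from this article.

```python
# PUE = total facility energy / IT equipment energy (as described above).
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# Example facility (hypothetical figures): 1.4M kWh total,
# 1.0M kWh consumed by IT equipment.
ratio = pue(total_facility_kwh=1_400_000, it_equipment_kwh=1_000_000)
print(ratio, "- good" if ratio < 1.5 else "- room for improvement")
```

Here the facility's overhead (cooling, power distribution, lighting) adds 40% on top of IT load, for a PUE of 1.4, just under the 1.5 rule of thumb.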
Reduce Dependency on Cooling
Data centers can reduce their dependency on cooling by using all of the previously mentioned measures, but there are also different forms of cooling that can prove to be more energy efficient. For example, liquid cooling is more efficient than air cooling, but implementation can be more complicated and expensive.
Free cooling takes advantage of outside air to cool the data center, either through evaporative cooling towers or through direct venting. However, this is only possible after installing evaporative cooling equipment or when the facility is located in a cold enough climate.
Facilities can also use a geo-exchange system, which uses the constant temperature of the earth to control the temperature in the data center. In the summer, warm air passes over the pipes, transferring heat to the fluid within. This fluid circulates through pipes in the ground, dissipating the heat.
In the winter, the opposite happens: heat from the ground is absorbed by the fluid in the pipes, which then circulates to the data center and warms the air. A geo-exchange system can make cooling more efficient, but it's another project that often requires expensive installation.
Prevent Airflow Bypass
What happens if the air that is supposed to cool servers misses them partially or completely? This is called airflow bypass. Gaps in the server racks or improperly installed blanking panels can contribute to it. Making sure cold air actually reaches the servers improves data center cooling efficiency.
Consolidate Your Data Center Footprint
If your servers have started to sprawl, consolidating your data center footprint can help you distribute power and cool your IT equipment more efficiently. Virtualizing, upgrading equipment, or changing the configuration of your servers can all be part of the consolidation process, which can also help reduce your carbon emissions.
Enhance Data Center Efficiency With High-Density Colocation
One of the greatest challenges data centers face today is keeping energy efficiency in check while meeting the demands of high-performance computing and other equipment that supports AI and GPU-intensive applications. High-density colocation facilities can support these high-powered computing workloads because they already have efficient cooling systems, power-efficient infrastructure, and optimization built into their design.
TierPoint offers high-density and ultra-high-density colocation to fit the ever-evolving needs of clients who are expanding their own capabilities in high-performance computing and GPU colocation, including AI workloads. Our industry-leading cabinets allow us to use space more efficiently and optimize power and cooling resources. Contact us to learn how we can accommodate your HPC and AI workloads in a high-density data center.