Image by Harikuttan333

The Green Grid: Resources for Free Cooling

Free Air Cooling Is Catching On in the Data Center Industry


It seems that everyone in the data center industry these days is looking for the most efficient and least expensive way to run their facility. It makes sense when you think about it: why spend more and narrow your profit margin when you could be actively seeking ways to reduce the cost of keeping your server farm up and running? With shrinking project budgets and mounting economic pressure, data center managers are considering every savings opportunity, and free cooling is one of them.

The industry’s biggest players are driving this trend.

eBay is currently running its Project Mercury data center in Phoenix, AZ at industry-high temperatures of 80+ degrees Fahrenheit, and maintains a PUE of 1.046 even when the desert sun spikes to a scorching 115+ degrees. Project Mercury uses warm water to cool the facility, and a case study released by The Green Grid shows that this method of high-temperature cooling is as efficient as running data centers at cooler temperatures.

Facebook’s data center in Prineville, Oregon and its facility in Sweden were both built to use 100 percent “free” cooling, and both have been very successful in keeping their PUEs low. Google’s data center in Belgium has no chillers at all and simply relies on the cold climate to keep its servers cool; the facility falls back on its internal air conditioning only about 7 days a year, on average.

As more data centers turn to free cooling methods, it is no surprise that The Green Grid has updated and released new free air cooling maps and resources. The updated Free Air Cooling Maps take into account the latest ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) guidelines, include two new data center classifications, and expand the range of allowable environmental conditions in hopes of encouraging more energy-efficient practices. The updated maps are published in a white paper released by The Green Grid and graphically show the potential for free cooling, under ideal conditions, at any location in North America, Europe, and Japan. In addition to the maps, The Green Grid has made it simpler for data center managers to calculate the estimated savings of free cooling by launching a webpage with an easy-to-use free-cooling savings calculator.
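For the curious, here is a rough idea of what such a calculation involves. The sketch below is not The Green Grid’s actual calculator; the temperature threshold, chiller load, and electricity price are all assumptions invented for illustration:

```python
# Illustrative sketch only -- not The Green Grid's actual calculator.
# Assumption: free cooling is viable whenever outdoor air is at or below
# a chosen dry-bulb threshold, and each free-cooling hour displaces a
# fixed chiller load. All numbers here are invented for the example.

def estimate_free_cooling_savings(hourly_temps_c,
                                  threshold_c=25.0,    # assumed cutoff
                                  chiller_kw=500.0,    # assumed chiller load
                                  price_per_kwh=0.10): # assumed tariff
    """Return (free-cooling hours, estimated savings in dollars)."""
    free_hours = sum(1 for t in hourly_temps_c if t <= threshold_c)
    savings = free_hours * chiller_kw * price_per_kwh
    return free_hours, savings

# A few fictitious hourly readings; a real run would use 8,760 of them.
sample_temps = [18.0, 22.5, 27.0, 31.0, 24.0, 19.5]
hours, dollars = estimate_free_cooling_savings(sample_temps)
print(f"{hours} free-cooling hours, ~${dollars:,.0f} saved")
```

In practice you would feed in a full year of hourly weather data for your site and pick a threshold matched to your ASHRAE class, which is exactly what the maps and calculator do for you.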

So as the data center industry continues to progress toward a greener, more efficient way of operating, The Green Grid stays at the forefront of these efforts, publishing resources and guidance to help keep the industry on the right path.


White Paper #46 – Updated Air-Side Free Cooling Maps: The Impact of ASHRAE 2011 Allowable Ranges

The Green Grid – Free-Cooling Estimated Savings Calculator

The Green Grid – North American Free Air Cooling Low Res Map – ASHRAE Class A2

The Green Grid – North American Free Air Cooling Low Res Map – ASHRAE Class A3


Thermal Energy Storage Technology for Cooling Servers in Data Centers

How One Data Center Is Using New Thermal Energy Storage Technology


Google Embraces Thermal Storage in Taiwan to Cool Its Data Center

Groundbreaking for Google’s new Taiwanese data center, which will use thermal energy storage

As part of Google’s ongoing effort to cut costs and save power in its custom-built data centers, the company will cool the servers in its new Taiwanese computing facility using a technique known as thermal energy storage. The project is part of Google’s ongoing data center infrastructure expansion in Asia, which so far also includes a data center in Singapore and another in Hong Kong.

The company is investing more than US$900m in its Asian expansion; on its website, it says it needs to prepare for the pace of growth in internet use across the region.

“More new Internet users are coming online every day here in Asia than anywhere else in the world,” reads the Google web page for the project. “They are looking for information and entertainment, new business opportunities, and better ways to connect with friends and family near and far. We’re building a data center to make sure that our users in Taiwan and across Asia can do just that.”

“What makes Asia unique is the fact that it’s a region of mobile-first,” Daniel Alegre, Google’s president for Asia-Pacific, said in a phone interview with Data Center Knowledge. “YouTube, for instance, is becoming a very large component of mobile usage.”

The company bought 15 hectares of land for the data center in Changhua County, Taiwan, in September 2011. Its long-term investment in the facility is estimated to exceed $300m. Each of the other two projects in Asia is expected to cost about the same.

Google expects to bring all three Asian data centers online in 2013.

Using the thermal energy storage technique, the company can run its air-conditioning systems at night, when electricity rates are lower, to chill insulated tanks filled with ice or liquid coolant; that stored cooling is then used to dissipate heat in the data center when ambient temperatures rise during the day. Google is not the first to employ the idea (i/o Data Centers, for example, already uses thermal energy storage alongside solar arrays at its Phoenix ONE data center), but this will be the first of Google’s own facilities to use the technology.
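To make the concept concrete, here is a toy sketch of the scheduling logic described above: run the chillers at night to bank cooling in an insulated tank, then drain the tank during the day. This is purely illustrative; the tank capacity, off-peak window, and load profile are made-up numbers, not Google’s design:

```python
# Toy model of nighttime thermal energy storage -- an illustration of the
# idea, not Google's actual design. Tank capacity, off-peak window, and
# load profile are all invented numbers.

OFF_PEAK_HOURS = set(range(0, 6)) | {22, 23}  # assumed cheap-power window

def schedule_hour(hour, tank_kwh, load_kwh,
                  tank_capacity_kwh=4000.0, charge_rate_kwh=800.0):
    """One hour of operation.

    Returns (new tank level, cooling the chillers must produce this hour),
    both in thermal kWh.
    """
    if hour in OFF_PEAK_HOURS:
        # Night: chillers meet the load AND bank extra cooling in the tank.
        charged = min(charge_rate_kwh, tank_capacity_kwh - tank_kwh)
        return tank_kwh + charged, load_kwh + charged
    # Day: drain the tank first; chillers cover only the shortfall.
    from_tank = min(tank_kwh, load_kwh)
    return tank_kwh - from_tank, load_kwh - from_tank

tank = 0.0
for hour in range(24):
    load = 300.0 if hour in OFF_PEAK_HOURS else 600.0  # assumed profile
    tank, chiller_kwh = schedule_hour(hour, tank, load)
    print(f"{hour:02d}:00  tank={tank:6.0f}  chillers={chiller_kwh:5.0f}")
```

The payoff is visible in the output: during the expensive daytime hours the chillers sit idle while the tank carries the load, which is the whole point of shifting cooling work to cheap nighttime electricity.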

Interestingly, Google’s approach to cooling is quite the opposite of eBay’s. As you might recall, eBay is currently running its Project Mercury data center at record-high temperatures of above 80°F. It seems the industry is still deeply divided on the most efficient operating temperature for a facility, and only time will tell which approaches are trends and which become standards.

“We’re very excited to be building this data center in Taiwan,” said Lee-Feng Chien, Managing Director, Google Taiwan. “We’re working as fast as we can to start bringing it online by the second half of 2013, so we can keep up with the rapid growth in capacity demand across the region, and to hire the team of around 25 full time Googlers that will manage the facility when it’s fully operational.

“I’m also happy to be able to confirm that this will be one of the most efficient and environmentally friendly data centers in Asia,” Chien continued. “Part of this will come from our nighttime cooling and thermal energy storage system – not a revolutionary idea, but the first of its kind in our global data center fleet. But we’re also custom designing each element of the facility – adapting for the local environment some of the design features our engineers have developed and continue to innovate on in our data centers around the world that have enabled us to use 50% less energy than typical facilities.”

As is typical in the data center industry, this may be a trend that is slow to catch on; but if and when it does, it has the potential to change cooling efficiency in data centers.

What do you think about thermal energy storage technology? Do you think it’s here to stay or just another industry phase?

Data Center Spotlight: eBay – Project Mercury

Spotlight Data Center: eBay – Project Mercury

Constructing an eBay Data Center in Phoenix, AZ

Why Phoenix? Why would anyone build a data center in the middle of a desert that reaches temperatures of up to 119°F? And how in the world are they going to cool it efficiently? I know these were the first questions that popped into my mind when I first heard about eBay’s new Project Mercury data center, and my guess is you were probably thinking the same things.

eBay is currently in the midst of a major overhaul, moving away from renting and leasing facilities toward owning its own, maximizing its ability to control efficiency. Consolidating 11 data centers into just 3 locations, all while deploying tens of thousands of servers within a six-month time frame, posed a massive challenge for the company.

“We needed to meet an organizational demand to consolidate data centers and deploy tens of thousands of new servers in under six months,” Dean Nelson, senior director for global foundation services at eBay, said in a statement. “As part of that process, we wanted to push the envelope in terms of what had been achieved in data center efficiency to date. eBay was able to achieve both goals because we took a metrics-based approach to the process, which included ensuring that our server and data center RFP processes used PUE and TCO to optimize our supply chain, taking a modular, scalable, and forward-thinking data center design, and aligning ourselves with The Green Grid’s DCMM from the beginning.”

Since the launch of Project Mercury, eBay has relied heavily on its partnership with The Green Grid for the strategy and development of its newest data center. The Green Grid has since released a case study that outlines the innovations and approaches used in the facility, in hopes of sharing the lessons learned with the industry.

Containers on the roof of eBay’s data center

You’re probably still wondering why Phoenix was chosen as the location for this project and not somewhere frigid and cold. Well, Project Mercury was born out of necessity, not an attempt to push limits or prove a point. eBay built the data center in Arizona simply because it was needed there; and with that understanding began the massive undertaking of building an efficient data center in an oven-like environment. “If we can solve this in a desert,” says Dean Nelson, eBay’s senior director of global foundation services and the man who oversaw the project, “we can solve this almost anywhere.”

So how did they do it, you ask? The solution was to cool the facility with warm water. eBay embraces the idea of running some of its servers at higher temperatures than the industry standard, insisting that servers can run just fine above 80°F. The water held in the facility’s outdoor tank can reach 87°F on the hottest summer days, but even this staggeringly warm water is cool enough to run the heat exchangers that keep the servers at operable temperatures.
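A little back-of-the-envelope physics shows why 87°F water still works. The heat a water loop can absorb is Q = ṁ × c_p × ΔT, so as long as the water stays cooler than the server exhaust it must absorb, even modest flow rates carry away a lot of heat. The flow rate and return temperature below are assumptions for illustration, not eBay’s specifications:

```python
# Back-of-the-envelope heat-exchanger math -- illustrative values only,
# not eBay's actual specifications.
# Heat absorbed by a water loop: Q = m_dot * c_p * delta_T

C_P_WATER = 4186.0  # specific heat of water, J/(kg*K)

def loop_capacity_kw(flow_kg_per_s, supply_c, return_c):
    """Heat (kW) a water loop absorbs for a given flow and temperature rise."""
    return flow_kg_per_s * C_P_WATER * (return_c - supply_c) / 1000.0

supply_c = (87 - 32) * 5 / 9  # 87 F tank water is about 30.6 C
# Assume 25 kg/s of flow warming to 40 C across the heat exchangers:
print(f"{loop_capacity_kw(25.0, supply_c, 40.0):.0f} kW of heat removed")
# Roughly 990 kW -- warm water still carries away plenty of server heat,
# provided it stays cooler than the air it has to absorb heat from.
```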

With a facility-wide PUE of 1.35 at a 30-40% server load, eBay’s headline number is respectable rather than remarkable. When the containers are considered individually, however, their PUE was as low as 1.018 in January and spiked only to 1.046 when outside temperatures hit 115°F in mid-August. These numbers are incredibly low given the environment, especially considering that up to 12 containers sit on the roof in direct Arizona sunshine while the data center still operates efficiently.
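For readers unfamiliar with the metric, PUE is simply total facility energy divided by the energy delivered to the IT equipment, so a container PUE of 1.046 means only about 4.6 percent overhead for cooling and power distribution. A quick sketch with invented power figures:

```python
# PUE = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every watt reaches the servers themselves.
# The power figures below are invented for illustration.

def pue(total_kw, it_kw):
    return total_kw / it_kw

it_load_kw = 1000.0  # assumed IT load in one container
overhead_kw = 46.0   # assumed cooling + power-distribution overhead
print(pue(it_load_kw + overhead_kw, it_load_kw))  # -> 1.046
```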

Project Mercury hits home for two reasons: first, and most obviously, the data center is located in our backyard, which is remarkable given our extreme climate; and second, the facility uses ServerLIFTs to move its IT equipment. Both make us equally excited, and we are very happy to welcome eBay’s green data center to its new hot home.