Spotlight Data Center:
eBay – Project Mercury
Why Phoenix? Why would anyone build a data center in the middle of a desert that reaches temperatures of up to 119°F? And how in the world are they going to cool it efficiently? These were the first questions that popped into my mind when I heard about eBay’s new Project Mercury data center, and my guess is you were thinking the same things.
eBay is currently in the midst of a major overhaul, moving away from renting and leasing facilities toward owning its own, giving it greater control over efficiency. Consolidating 11 data centers into just 3 locations, all while deploying tens of thousands of servers within a six-month time frame, posed a massive challenge for the company.
“We needed to meet an organizational demand to consolidate data centers and deploy tens of thousands of new servers in under six months,” Dean Nelson, senior director for global foundation services at eBay, said in a statement. “As part of that process, we wanted to push the envelope in terms of what had been achieved in data center efficiency to date. eBay was able to achieve both goals because we took a metrics-based approach to the process, which included ensuring that our server and data center RFP processes used PUE and TCO to optimize our supply chain, taking a modular, scalable, and forward-thinking data center design, and aligning ourselves with The Green Grid’s DCMM from the beginning.”
Since the launch of Project Mercury, eBay has relied heavily on its partnership with The Green Grid for strategy and development of its newest data center. The Green Grid has since released a case study that outlines the innovations and approaches used in the facility, in hopes of sharing the lessons learned with the industry.
You’re probably still wondering why Phoenix was chosen as the location for this project and not somewhere frigid. Well, Project Mercury was born out of necessity, not an attempt to push limits or prove a point. eBay simply built the data center in Arizona because it was needed there, and with that understanding began the massive undertaking of building an efficient data center in an oven-like environment. “If we can solve this in a desert,” says Nelson, who oversaw the project, “we can solve this almost anywhere.”
So how did they do it, you ask? The solution was to cool the facility with hot water. eBay embraces the idea of running some of its servers at higher temperatures than the industry standard, insisting that servers can run just fine above 80°F. The water held in its outdoor tank can reach 87°F on the hottest summer days, but even this staggeringly warm temperature is cool enough to run the heat exchangers that keep the facility’s servers at operable temperatures.
The facility’s overall PUE of 1.35 at a 30-40% server load is respectable but hardly worth bragging about. When the containers are considered individually, however, their PUE was a low 1.018 in January and spiked only to 1.046 when outside temperatures rose to 115°F in mid-August. Those numbers are remarkably low given the environment. Interestingly, up to 12 containers can sit on the roof in direct Arizona sunshine and the data center still operates efficiently.
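For readers new to the metric: PUE (power usage effectiveness) is The Green Grid’s ratio of total facility power to the power delivered to IT equipment, so a value of 1.0 would mean every watt goes to computing. A minimal sketch of the arithmetic follows; the kilowatt figures are hypothetical, chosen only to illustrate how ratios like those reported above arise.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical loads picked to mirror the figures cited in the article:
print(round(pue(1350.0, 1000.0), 3))  # whole-facility figure -> 1.35
print(round(pue(1018.0, 1000.0), 3))  # a container in January -> 1.018
print(round(pue(1046.0, 1000.0), 3))  # a container in August heat -> 1.046
```

The closer the result is to 1.0, the less power is spent on cooling and distribution overhead, which is why the per-container figures are so striking for a desert site.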
Project Mercury hits home for two reasons: first, and most obviously, the data center is located in our backyard, which is incredibly unusual given our extreme climate; second, the facility uses ServerLIFTs to move its IT equipment. Both reasons make us equally excited, and we are very happy to welcome eBay’s green data center to its new hot home.