Data Center Spotlight: Interxion, London


Would you sleep at work?


As England prepares for the 2012 Summer Olympic Games, one company thinks it may have found an innovative way to beat the congestion that is about to clog London’s travel and road networks.

Interxion, a colocation data center located in the heart of the city, has come up with a solution that will ensure its tech staff are available 24/7. What, you ask, is so ingenious that it can guarantee employees will be at work on time, at all times? A hovercraft to fly over the bumper-to-bumper traffic? A Tube car designated exclusively for Interxion employees? No, Interxion has conveniently cut out every problem created by the massive traffic and transportation headache that the Olympics will bring. It has, very creatively and resourcefully, installed sleeping pods in its data center, effectively ensuring that its techs never have to see the outside world or the light of day again. No late workers, no downtime issues, no stress.

Interxion has good reason to worry about how the Games will affect its data center operations and uptime. The company plays an integral part in London’s digital world, hosting online services spanning social media, entertainment, online retail, banking, and education. Millions of people across the UK rely on Interxion’s service delivery, so any downtime could be detrimental.

As Greg McCulloch, Interxion’s UK Managing Director, commented in a press release this week, “Due to the nature of our business we need to be ready for all eventualities and while we are excited to have the Olympics in London we also need to be sure that we can continue to offer the highest level of resilience to our customers. The installation of the sleeping pods is another great example of Interxion putting resilience and uptime at the forefront of everything it does.”

The pods themselves were created by the UK-based company Podtime and are modular in design. Although originally intended for power naps, Interxion thought of that as well and had its 10 pods retrofitted for overnight stays. Said to be comfortable, private, and secure, the pods feature TVs, radios with earphones, laptop holders, and memory foam mattresses. How comfortable they actually are remains to be seen.


“Everyone working in central London will feel the challenging side-effect of the Games, and the organizations they work for will face operational issues with employees being stuck in queues rather than being in the office,” said Jon Gray, Director at Podtime. “We see the sleeping pods as a good cheap solution for those ‘staff-critical’ companies which must have 24/7 cover for vital procedures… our pods offer comfort, privacy and security at a reasonable price.”

So as London gears up for the 2012 Olympic Games, it seems that Interxion is one step ahead of its competitors in ensuring that nothing will go wrong in its data center. But as innovative as its pod strategy is, would you be willing to sleep at work?

Photo courtesy of Interxion.

By NapoliRoma (Own work) [CC-BY-SA-3.0], via Wikimedia Commons

Modular Data Center Designs, More Than Just a Trend?


It’s been more than five years since the unveiling of Sun’s Blackbox data center, and modular designs have become increasingly common in the industry, particularly in high-performance and cloud computing. Running servers in shipping containers was initially viewed by many in the data center industry as a niche play, limited to mobile requirements, temporary capacity, or novel designs like cloud computing facilities. But with enterprise users such as HP, Dell’s Data Center Solutions Group, and eBay publicly announcing the adoption of modular designs in their data centers, it’s clear this is definitely more than just a trend. With the massive growth of the data center industry and a strong push toward cost-effective planning and operation, modular designs are expected to become a viable alternative for expansions and new builds.


The idea of the modular data center has evolved from the basic premise of taking an ISO-standard shipping container and customizing it to accommodate data center infrastructure. Designed appropriately, such facilities can attain extreme levels of performance through a consistent design technique, while capital costs can be reduced by standardizing components, construction, and the supply chain. In the past few years the modular approach has split in two directions: some vendors have focused on the design of complete data center solutions (e.g., HP FlexDC and i/o Data Centers’ i/o Anywhere), while others have moved the modular concept down to the rack or row level (e.g., APC InfraStruxure or Emerson SmartAisle).

So what is the potential market? An August 2011 survey of Data Center Knowledge readers found that 35% are currently either using modular products or evaluating them with an eye toward adopting the technology in the next 12 to 24 months. Here’s how the responses broke down (a quick tally follows the list):

  • Implementing modular data centers broadly: 7%
  • Implementing selectively: 10%
  • Modular data centers in testing and development: 7%
  • Planning to implement in next 12 months: 3%
  • Planning to implement in 12-24 months: 7%
  • No plans for modular data centers: 65%
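
As a quick sanity check on that 35% figure: the first five categories sum to 34%, which lines up with the quoted share once rounding in the published percentages is accounted for (note that all six categories total 99%, not 100%). A minimal tally in Python, using the article’s own figures:

    # Tally of the August 2011 Data Center Knowledge reader survey above.
    # Figures are the article's; the ~1-point gap from the quoted 35% is
    # presumably rounding in the published percentages.
    survey = {
        "Implementing broadly": 7,
        "Implementing selectively": 10,
        "Testing and development": 7,
        "Planning within 12 months": 3,
        "Planning in 12-24 months": 7,
        "No plans": 65,
    }

    using_or_evaluating = sum(v for k, v in survey.items() if k != "No plans")
    print(f"Using or evaluating: {using_or_evaluating}%")  # 34% (quoted as ~35%)
    print(f"All categories: {sum(survey.values())}%")      # 99%, due to rounding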

HP noted a projection by IDC analyst Michelle Bailey that modular deployments will rise from 144 units this year to about 220 units in 2012, roughly a 53% increase; a large jump considering the industry’s usually slow embrace of new technologies.

“Today’s data center is obsolete when taking modularity and the fast maturation of this market into consideration,” said Jason Schafer, research manager at Tier1 Research. “If data center owners and operators are not at least exploring and considering modular components as a means for data center expansions and new builds, they are putting themselves at a significant disadvantage from a scalability, cost and possibly maintenance standpoint.”

Tier1 Research, like many in the industry, was initially skeptical of modular designs. But it’s not alone among leading analyst firms in believing that modular designs have a place at the table. David Cappuccio, chief of research for the infrastructure team at Gartner, has also discussed the growing appeal of modular designs.

“When planning for data center growth, it is important that all alternatives be reviewed,” Cappuccio said last year. “Newer modular design techniques and container-based solutions should be a critical piece of your analysis. When used appropriately, they can solve specific problems, while reducing capital costs and the time it takes to implement new capacity.”

Interestingly, modular data center designs will also be a focal point at the 2012 Uptime Symposium this May. Last year the Uptime Institute’s Professional Services unit hired a veteran of HP’s modular program, Debbie Seidman, as its new Director of Technical Services; she will manage the Uptime Institute’s delivery of Design and Facility Tier Certifications worldwide.

“I don’t think we’ll see the entire data center market going modular,” Seidman said. “It’s adaptable, compact, and can be less expensive in upfront costs. But you can’t just plug these things in; you need to ensure the infrastructure is in place.”

So with enterprise users embracing it, Uptime making it a focal point of its 2012 Symposium, and the industry warming to the idea, modular data centers might just be the wave of the efficient data center future.

Data Center Spotlight: eBay – Project Mercury


Constructing an eBay data center in Phoenix, AZ

Why Phoenix? Why would anyone build a data center in the middle of a desert that reaches temperatures of up to 119°F? And how in the world are they going to cool it efficiently? I know these were the first questions that popped into my mind when I first heard about eBay’s new Project Mercury data center, and my guess is you were thinking the same things.

eBay is currently in the midst of a major overhaul, moving away from renting and leasing facilities toward owning its own, which gives it far better control over efficiency. Consolidating 11 data centers into just 3 locations, all while deploying tens of thousands of servers within a six-month time frame, posed a massive challenge for the company.

“We needed to meet an organizational demand to consolidate data centers and deploy tens of thousands of new servers in under six months,” Dean Nelson, senior director for global foundation services at eBay, said in a statement. “As part of that process, we wanted to push the envelope in terms of what had been achieved in data center efficiency to date. eBay was able to achieve both goals because we took a metrics-based approach to the process, which included ensuring that our server and data center RFP processes used PUE and TCO to optimize our supply chain, taking a modular, scalable, and forward-thinking data center design, and aligning ourselves with The Green Grid’s DCMM from the beginning.”

Since the launch of Project Mercury, eBay has relied heavily on its partnership with The Green Grid, whose Data Center Maturity Model (DCMM) helped guide the strategy and development of its newest data center. The Green Grid has since released a case study outlining the innovations and approaches used in the facility, in hopes of sharing the lessons learned with the industry.

Containers on the roof of eBay’s data center

You’re probably still wondering why Phoenix was chosen as the location for this project and not somewhere frigid and cold. Well, Project Mercury was born out of necessity, not out of an attempt to push limits or prove a point. eBay built the data center in Arizona simply because it was needed there, and with that understanding began the massive undertaking of building an efficient data center in an oven-like environment. “If we can solve this in a desert,” says Nelson, who oversaw the project, “we can solve this almost anywhere.”

So how did they do it, you ask? The solution was to cool the facility with hot water. eBay embraces the idea of running some of its servers at higher temperatures than the industry standard, insisting that servers can run just fine above 80°F. The water held in the facility’s outdoor tank can reach 87°F on the hottest summer days, but even this staggeringly warm water is still cool enough to run the heat exchangers that keep the servers at operable temperatures.
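
To see why 87°F water can still carry heat away, it helps to run the basic heat-balance formula Q = ṁ × c_p × ΔT: the water only needs to stay cooler than the air leaving the servers, and the flow rate scales with the load. Here is a minimal sketch in Python; the 500 kW container load and the 10°F water temperature rise are our own illustrative assumptions, not figures published by eBay:

    # Rough heat-balance estimate: how much 87°F water is needed to absorb
    # a container's heat load? Uses Q = m_dot * c_p * delta_T.
    # NOTE: the load and temperature-rise figures below are illustrative
    # assumptions, not numbers published by eBay.

    C_P_WATER = 4186.0        # specific heat of water, J/(kg*K)

    def f_to_k_delta(delta_f: float) -> float:
        """Convert a Fahrenheit temperature *difference* to kelvins."""
        return delta_f * 5.0 / 9.0

    it_load_w = 500_000.0             # assumed container IT load: 500 kW
    supply_f, return_f = 87.0, 97.0   # water in at 87°F, assumed out at 97°F
    delta_k = f_to_k_delta(return_f - supply_f)

    # Mass flow of water needed to carry the load: m_dot = Q / (c_p * delta_T)
    m_dot_kg_s = it_load_w / (C_P_WATER * delta_k)
    liters_per_min = m_dot_kg_s * 60.0   # ~1 kg of water per liter

    print(f"Required flow: {m_dot_kg_s:.1f} kg/s (~{liters_per_min:.0f} L/min)")
    # -> Required flow: 21.5 kg/s (~1290 L/min)

In other words, warm water is no obstacle as long as it stays below the servers’ exhaust temperature; you simply pump enough of it.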

With a PUE of 1.35 when there’s a 30-40% load on the servers, the facility-wide figure gives eBay little to brag about. When the containers are considered individually, however, their PUE was a low 1.018 in January and spiked only to 1.046 when outside temperatures reached 115°F in the middle of August. These numbers are incredibly low given the environment the containers sit in: up to 12 of them can bake on the roof in direct Arizona sunshine while the data center still operates efficiently.
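
For context, PUE (power usage effectiveness) is simply total facility power divided by the power delivered to IT equipment, so a PUE of 1.35 means 0.35 watts of cooling and distribution overhead for every watt of compute, while the containers’ 1.018 means under 2% overhead. A quick illustration in Python, using a made-up 1 MW IT load purely for the arithmetic:

    # PUE = total facility power / IT equipment power (The Green Grid metric).
    # The 1 MW IT load below is a made-up figure purely to illustrate the math.

    def overhead_kw(pue: float, it_load_kw: float) -> float:
        """Non-IT power (cooling, distribution, lighting) implied by a PUE."""
        return (pue - 1.0) * it_load_kw

    IT_LOAD_KW = 1000.0  # hypothetical 1 MW of IT equipment

    for label, pue in [("facility-wide", 1.35),
                       ("container, January", 1.018),
                       ("container, August at 115°F", 1.046)]:
        print(f"{label}: PUE {pue} -> {overhead_kw(pue, IT_LOAD_KW):.0f} kW overhead")

    # facility-wide: PUE 1.35 -> 350 kW overhead
    # container, January: PUE 1.018 -> 18 kW overhead
    # container, August at 115°F: PUE 1.046 -> 46 kW overhead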

Project Mercury hits home for two reasons: first, and most obviously, the data center sits in our own backyard, which is remarkable given our extreme climate; and second, the facility uses ServerLIFTs to move its IT equipment. Both make us equally excited, and we are very happy to welcome eBay’s green data center to its new hot home.