In conversations about the future of the data centre, two spectres threaten to undermine many of the more promising developments: heat and power. Too much of one and too little of the other.
With the advent of dense processing configurations, such as blade servers, more computers than ever are being squeezed into data centre racks. Limited space is adding to the problem: in many cases, as companies expand, they need to cram more computing capacity into the same few hundred square feet – especially in large cities like London, where land is at a premium.
"I don't think it's a crisis yet but it's a growing problem," says Mike Tobin, CEO of Redbus Interhouse, a data centre co-location service provider. "Racks that used to carry four or six servers now carry 40. So you need ten times the amount of power and cooling going through each rack. We pull 20 megawatts into our data centres – you can't get that kind of power in the City of London."
Guy Willner, CEO of data centre services company IXEurope, agrees. "Five years ago a rack was using 5kW – now with some of these racks, it's the equivalent of getting a wardrobe and putting twenty 1kW electric bar fires in and then plugging them all in – it's pretty hot."
Many of today's data centres were designed with a very different generation of servers and even mainframes in mind. As well as being smaller and more densely packed, today's machines consume more power. Nor is power consumption something IT directors have typically felt they needed to worry about. Alireza Mahmoodshahi, chief technology officer of business communications supplier Colt Telecom, says they shouldn't have to: "What we need to do is work with the manufacturers, to reduce the heat generation from the boxes. But they normally don't do that if we don't force them, because it costs them money in development."
Of course, one option is to hand the power problem over to an outsourcer. "It's damned expensive for customers to put in 24kW per rack and it will probably outweigh the benefits of any space saving they get from blade servers," says Willner. "And there is no real alternative [to using data centre services] because if you put it in your own offices you might even be contravening fire regulations with that sort of heat generation."
Outsourcing is not the best fit for everyone, though. Niels Roberts, data centre manager at EDF Energy, for example, evaluated co-location and found it more expensive than keeping his data centre entirely in-house. "We're very focused on driving down the cost of our operation and we periodically benchmark ourselves using external providers. At the moment on the numbers that we see, we will continue to run everything in-house but [managed hosting] is certainly not something that we'd close our mind to."
But IXEurope's Willner argues that companies need to make sure they are taking into consideration all the costs associated with looking after their own data centre. "The security guards that you have on site 24 hours a day are there to protect the data centre, not the cleaning lady," he says. Roberts acknowledges that if he needed to rearchitect to house new blade servers, he would probably look at co-location more seriously.
Colt's Mahmoodshahi warns that co-location and managed services providers could soon face the same heat problems as their customers. He foresees a time when power limitations are set for a customer's rack. "If you exceed that power limit your price goes up. That will give incentives to the customers to go back to the manufacturers of blade servers, because right now they think they're getting something for nothing and I don't think we can stay like that forever."