The data centre is the focal point of many of the most pressing concerns in IT. On the one hand, the data centre is the physical embodiment of an organisation’s information processes, where its information-sharing practices are made real in the form of metal, heat and power.
At the same time, the consolidation of data centre resources allows organisations to consume data centre services in abstraction, creating flexible, cost-scalable and hybrid IT environments.
In one day, Information Age’s recent Future of the Data Centre conference covered the latest developments in the field, from managing heat and power to the possibility of all-software-based data centres.
It began with an economic and political view of the data centre sector from Stephen Norris, former transport minister and now president of the trade group the Data Centre Alliance.
Data centres, Norris argued, are a unique class of infrastructure: the more energy they consume, the more they save. “On every occasion where we as a data centre industry have taken a company’s data requirements away from an office environment, [the data centre] manages the energy more efficiently,” Norris said.
Norris accused the government of failing to understand this, pointing to energy efficiency policies, such as the CRC Energy Efficiency Scheme, which punish data centre operators for aggregating their customers’ IT-related energy footprints.
This is especially myopic, he says, at a time when the government is placing high hopes on the UK’s digital economy. “Britain has an open and transparent economy, and at the heart of it is our success in the digital age,” said Norris. “The UK needs more energy-resilient data centres to be competitive.”
Norris argued that the data centre industry needs to reframe the debate, articulating the point that – assuming demand for digital services will continue to grow – data centres will help cut the UK’s overall energy consumption. “We have to present data centres not as a problem, but as a solution.”
And demand for digital services certainly will grow. According to Dr Ian Bitterlin, CTO at ARK Continuity, both the world’s computing capacity and demand for digital services are growing faster than Moore’s Law – the prediction that processing power will double every 18 months. Referencing the work of futurist Raymond Kurzweil, Bitterlin proposed that both demand and processing power are now doubling every 1.2 years, not 1.5.
This is borne out by industry predictions, he added. Networking equipment-maker Nokia Siemens Networks predicts that total mobile data volumes will increase 2,500% between now and 2015, while market watcher IDC expects combined global data volumes to grow 4,400% between 2009 and 2020.
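As a rough illustration of what these doubling periods imply – a sketch using only the rates quoted above, with nothing measured – a quantity that doubles every T years grows by a factor of 2^(years/T):

```python
# A rough sketch of the growth arithmetic quoted above: a quantity that
# doubles every T years grows by a factor of 2 ** (years / T).

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Total growth multiple after `years` at the given doubling period."""
    return 2 ** (years / doubling_period_years)

# Moore's Law as popularly stated: doubling every 18 months (1.5 years).
print(f"{growth_factor(10, 1.5):.0f}x over a decade")   # ~102x

# The faster rate Bitterlin cites: doubling every 1.2 years.
print(f"{growth_factor(10, 1.2):.0f}x over a decade")   # ~323x
```

Shaving just 0.3 years off the doubling period roughly triples the growth delivered over a decade, which is why the distinction matters for capacity planning.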
That, of course, means that the IT sector’s energy footprint has to grow. “If you believe the power growth that the industry is predicting,” Bitterlin says, “we will be using 10% of the national grid anywhere between 2014 and halfway through 2015.”
Most large data centre operators are in the process of reducing the power usage effectiveness (PUE) of their facilities – the ratio of a facility’s total power draw to the power consumed by its IT equipment – primarily by making cooling systems more efficient or by using them less. But demand will outpace even this progress, Bitterlin said.
“If we build more efficient data centres, the demand for power will still keep going up,” he explained.
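To make the metric concrete, here is a minimal sketch of the PUE calculation; the kilowatt figures are assumed purely for illustration and do not come from the conference.

```python
# A minimal sketch of the PUE calculation: total facility power divided by
# the power drawn by the IT equipment alone. The figures are illustrative.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power."""
    return total_facility_kw / it_equipment_kw

# 1,000 kW of IT load in a facility drawing 1,500 kW overall:
print(pue(1500.0, 1000.0))  # 1.5 -- half a watt of overhead per watt of IT

# More efficient cooling lets the same IT load run on 1,200 kW overall:
print(pue(1200.0, 1000.0))  # 1.2
```

Bitterlin’s point is that even as this ratio falls, the IT load underneath it keeps rising fast enough to push total consumption up.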
Bitterlin argues that continuing growth in demand will therefore drive even further consolidation in the data centre market, as only the larger facilities will enable low-cost digital services in the face of escalating energy prices.
Energy in focus
By contrast, Jim Hearnden, enterprise technologist and member of technology body IEEE, said that rising energy costs can be contained using efficient temperature control in the data centre.
“There’s a sweet spot of around 25 to 27 degrees,” Hearnden says. “We say to customers: up your temperature and look to see where your areas are getting hot, and where you’re overcooling, and balance them out by moving tiles or by moving hardware around to get your optimum temperature across the data centre.”
However, Hearnden warns that increasing the temperature can present issues. “The main issue of increased temperature is the UPS, where the optimum for batteries is 22 degrees centigrade,” he said. “At 27 [degrees] it reduces the battery [life]. Ten degrees up and you begin to halve the life of lead acid batteries, and could end up replacing them every 18 months to two years, rather than their three- to four-year life-cycle.”
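A short sketch of Hearnden’s rule of thumb, assuming a four-year baseline life at the 22-degree optimum (the upper end of the life-cycle he quotes):

```python
# A sketch of Hearnden's rule of thumb: lead-acid UPS battery life halves
# for every 10 degrees C above the 22 C optimum. The four-year baseline is
# an assumption taken from the upper end of the life-cycle he quotes.

def battery_life_years(ambient_c: float,
                       baseline_years: float = 4.0,
                       optimum_c: float = 22.0) -> float:
    """Estimated lead-acid battery life, halving per 10 C above optimum."""
    excess_c = max(0.0, ambient_c - optimum_c)
    return baseline_years * 0.5 ** (excess_c / 10.0)

print(battery_life_years(22))  # 4.0 years at the optimum
print(battery_life_years(27))  # ~2.8 years at the top of the "sweet spot"
print(battery_life_years(32))  # 2.0 years -- ten degrees up halves the life
```

On these assumed numbers, the energy saved by running the room warmer has to be weighed against replacing batteries roughly every two years instead of every four.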
Hearnden encourages data centre managers to monitor temperatures at the server level, the rack level and in the air around the data centre. “Rack levels will show up hotspots and spots where you’re wasting cooling air, but try to separate the IT from the plant itself. The last thing you want is somebody coming to fix the air conditioning working around the racks.”
Another factor that can affect data centre performance is the applications hosted on its servers. “If you look at HR, the payroll has to go out,” Hearnden says. “But if users can’t log onto HR to do an internal exam and training, is it the end of the world? Probably not, and you’ve put that usage and application onto a highly [available] server.”
With the multitude of potential areas in which wastage can occur in data centres, when it comes to building new ones there is a clear case for prevention rather than cure.
Craig Felton, senior data centre consultant at PTS Consulting Group, said that using software that simulates data centre designs can catch issues before they happen. “The objective of the data centre simulation program is to de-risk the geographic location and build highly resilient data centres to last 10 to 12 years,” Felton said. “By simulating, we can deliver data centres to time and budget and make sure they are fit for the operations that we’ve built them to achieve.”
It is not uncommon for businesses to simulate their data centres at the build stage, but Felton advocates maintaining a simulation during ongoing operations. This allows organisations to predict the impact of introducing a new server rack or item of power infrastructure, which may otherwise upset the flow of air through the data centre or affect the distribution of electricity through the site.
In this way, he explains, simulation allows organisations to accommodate new data centre technologies without the risk of unintended consequences.
The software-defined data centre
Data centre simulation and operating temperature control are just two of the techniques that organisations running their own facilities can use to master the physical constraints that dominate data centre management.
Many organisations, however, have chosen to outsource responsibility for some or all of the physical components of a data centre, through co-location, hosting or even cloud computing services.
As Anthony Dickinson, service director of cloud computing at GlassHouse, explains it, the market for what might be loosely termed data centre services ranges from physical, hardware-focused offerings to services that have been entirely abstracted into software. As IT services are abstracted away from the hardware level, Dickinson said, they also become more commoditised.
“At the top of the marketplace we have mainframes, which are still sitting in data centres and haven’t been uprooted because they’re required,” he said. “But then down at the lower end of the market came client server, which disrupted mainframe and [mini]computers, and is now the dominant architecture, deployed on x86.”
At the very bottom, Dickinson says, lie infrastructure-as-a-service offerings like Amazon Web Services, which are making their way into the enterprise.
As technologies move up what Dickinson calls the “commoditisation curve”, the rate at which improvements are delivered exceeds the rate at which they can be consumed, he says, allowing for disruption of what came before.
“Data centres are going to need to deliver a number of new capabilities,” Dickinson said. “We’ve had server virtualisation for a number of years, known and trusted, and we’re starting to see true virtualisation of networks now, getting over the limitations of current LAN technologies, allowing data centres to be software defined.”
Abstracting IT systems away from hardware, through virtualisation and other cloud-related technologies, is giving rise to new permutations of data centre infrastructure. According to David Hall, commercial and strategy manager at TelecityGroup, the co-location provider’s customers are increasingly integrating their own data centre environment with those of cloud providers and business partners to create hybrid environments.
“As well as installing equipment, and connecting to a carrier and connecting back to the office, we’re seeing customers connect into other customers,” Hall said. “We’re seeing people connect in the data centre to some cloud providers, and maybe connect to market data providers, producing a much richer web of connectivity inside the building.”
Hall explains that a common barrier to cloud adoption – being locked into one provider – is avoided with the use of a hybrid model. “In a hybrid model, you’ve got the ability to have some of your own equipment,” he said. “So a very common way that we see people deploying equipment is to have their own storage platform and some core application platforms on their own systems connected to one or more cloud provider.”
There are, however, still organisations for whom the simple act of using co-located data centre services offers substantial business benefits. Russell Warman, head of service delivery at Trader Media Group, told the conference that using Telecity’s co-location services has allowed the company to focus on its website and mobile services.
“We use a lot of the data we get from our website to get back to customers to ensure that they’re making the best decisions around their business,” said Warman. “We did this through investing in IT to build a stable and secure platform, and we also developed a suite of mobile products and other offerings.”
Warman said that the company had several key considerations when moving out of its own data centre. A carrier-neutral facility was required, he said, which had to be located in Manchester and close to a Tier 1 ISP to give the firm low latency when delivering its mobile applications.
“There were constraints around the physical power that we could provide to the site we were in,” he said. “We were hitting hard limits with power, and we were also faced with cooling, scalability and expansion challenges.”
Warman said that moving to a co-location centre left the company free to focus on designing apps and delivering value to the business. Other advantages included knowing that trained staff were managing the air conditioning, configuring servers properly and maintaining the plant.