When choosing a location for a data centre, it is a brave manager that would select a spot on a 200-metre-thick ice sheet, where summer temperatures peak at -5°C before dropping to -30°C in winter, and where there is no daylight for 55 days of the year.
For David Blake and his team, there was little choice: they work for the British Antarctic Survey (BAS). But if providing IT to researchers in the most hostile environment on the planet were not challenge enough, the latest project – equipping the next-generation Halley research station – promises to make that tough job just a touch harder.
Halley VI, like its predecessors, will be situated on the Brunt Ice Shelf. The most recent research centre, Halley V, is at risk of falling into the sea as parts of the ice shelf break up. But despite the extreme location, the server room for the Halley VI centre will look very familiar to most IT managers. It will have a Gigabit Ethernet backbone, run mirrored disk storage systems, and use Novell's network operating system.
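Mirroring matters more than usual when replacement hardware is a ship's voyage away. The sketch below is a minimal, hypothetical Python illustration of the principle, not BAS's actual configuration: every write lands on two independent volumes, so either copy can serve reads if the other disk is lost.

```python
from pathlib import Path

# Hypothetical mount points used purely for illustration.
MIRROR_A = Path("/mnt/disk_a")
MIRROR_B = Path("/mnt/disk_b")

def mirrored_write(relative_path: str, data: bytes) -> None:
    """Write the same payload to both volumes."""
    for volume in (MIRROR_A, MIRROR_B):
        target = volume / relative_path
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_bytes(data)

def mirrored_read(relative_path: str) -> bytes:
    """Read from the first volume that still holds the file."""
    for volume in (MIRROR_A, MIRROR_B):
        target = volume / relative_path
        if target.exists():
            return target.read_bytes()
    raise FileNotFoundError(relative_path)
```

In practice mirroring is done at the block or controller level rather than per file, but the trade-off is the same: double the storage in exchange for surviving a single disk failure.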
Indeed, one of the biggest challenges for Halley VI – powering and cooling the server rooms – is common to most IT organisations. But there is an added problem: the cost. On the Antarctic ice sheet, all electricity has to be generated within the complex, and that requires fuel for the generators. The shipping cost alone for one drum of fuel works out at around £180, so Blake has to keep power consumption to a minimum.
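That £180-a-drum figure is worth unpacking. The back-of-the-envelope calculation below shows why shipping dominates the economics of every watt: only the shipping figure comes from the article, while the drum size, fuel purchase price and generator yield are illustrative assumptions.

```python
# Back-of-the-envelope fuel economics. Only the ~£180-per-drum shipping
# figure comes from the article; everything else is assumed.
DRUM_LITRES = 205            # assumed standard fuel drum
SHIPPING_PER_DRUM = 180.0    # GBP, from the article
FUEL_PRICE_PER_LITRE = 0.50  # GBP, assumed purchase price
KWH_PER_LITRE = 3.5          # assumed electrical yield of a diesel generator

delivered_cost_per_litre = FUEL_PRICE_PER_LITRE + SHIPPING_PER_DRUM / DRUM_LITRES
cost_per_kwh = delivered_cost_per_litre / KWH_PER_LITRE

print(f"Delivered fuel: £{delivered_cost_per_litre:.2f}/litre")
print(f"Electricity:    £{cost_per_kwh:.2f}/kWh")
# Under these assumptions, electricity comes out at roughly £0.39/kWh,
# several times a typical grid price.
```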
To get round the problem, the computers will pair traditional power-saving software with low-power processors, keeping consumption to an absolute minimum. Operating computers inevitably generates heat, so to make the Halley VI site as fuel-efficient as possible, the hot air from the site's servers will be recycled to help heat the centre's water.
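The numbers behind the heat-recovery idea are easy to rough out, since almost every watt a server draws ends up as heat. In the sketch below, the server count comes from the article; the per-server draw and the water-heating figures are assumptions for illustration.

```python
# Rough sizing of server heat recovery. Server count is from the
# article; per-server draw and water figures are assumptions.
SERVERS = 22
WATTS_PER_SERVER = 150       # assumed draw for a low-power machine
WATER_HEAT_CAPACITY = 4186   # J per kg per degree C
TEMP_RISE_C = 30             # assumed target rise for pre-heated water

heat_watts = SERVERS * WATTS_PER_SERVER   # ~3.3 kW of recoverable heat
kg_per_hour = heat_watts * 3600 / (WATER_HEAT_CAPACITY * TEMP_RISE_C)

print(f"Recoverable heat: {heat_watts / 1000:.1f} kW")
print(f"Water warmed by {TEMP_RISE_C}°C: ~{kg_per_hour:.0f} litres/hour")
# Around 95 litres/hour under these assumptions: modest, but free heat
# in a place where every litre of fuel is shipped in.
```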
Given its remote location, the Halley VI research station will require a dedicated IT engineer on site. Along with the 22 servers and various desktops, the engineer will have to support the satellite communications infrastructure, the Internet link and the range of bespoke devices BAS employs. "Out in the field, we tend to have highly ruggedised equipment," says Blake, "but we also have some special requirements, such as heating containers for PDAs. A lot of this we make ourselves."