“Data centres can mean different things to different people. On the one hand a data centre might comprise a rack or server in one room; on the other it’ll be a fortress worth millions of pounds.” Fortress or not, says Lawrence Harvey, the director of UK data centre development for information services giant Experian, they all need protecting, because an attack on a company’s data centre, however large or small, can be crippling.
Data centre security, if Harvey’s observations are right, is often lax to the point of corporate negligence: “Security per se is not actually the issue. The real problem is carelessness.” He cites the example of one company whose data centre’s exact location was highlighted on its architect’s website. Not a good idea when you are trying to protect the physical site, he says.
Even IT people are blind to some risks. It is not uncommon, for example, for companies to back up all their data to systems that sit literally next door to the main computer room. As he says: “There’s nothing to protect the income stream of a company if their back-up server is under a desk.”
Lawrence Harvey
In 2003, when Experian embarked on the construction of a purpose-built, £31 million European data centre outside Nottingham, its head of infrastructure Lawrence Harvey took on the pivotal role of programme director for the migration to the new facility – a high-security complex that now houses three IBM mainframes, some 600 servers, 100 terabytes of online and 2 petabytes of offline data. As well as ensuring data centre service delivery in EMEA and Asia-Pacific, Harvey is now also deeply involved in the company’s acquisitions strategy.
Just as in any discussion about security, the most dangerous and least predictable element is the human one. “It’s easy to protect small host systems, but it’s difficult to protect against diverse people,” he says. He emphasises that ‘trojan horses’ do not necessarily have to be computer virus carriers; they can come in human form too, as individuals infiltrate the organisation.
One way to limit the kind of impact such ‘sleepers’ can inflict is through segmentation. “You need to zone off the production services from the external access points, and the internal users from the host system. Moreover, you need to segment the developers off from everyone. Developers always think they have a reason for admin rights; they want to turn off, say, the anti-virus while they let new things load. It’s crazy.”
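By way of illustration, the zoning Harvey describes can be thought of as a default-deny matrix between network segments: nothing moves between zones unless a flow has been explicitly permitted. The sketch below, in Python, uses hypothetical zone names and allowed flows purely to make the idea concrete; it is not Experian’s configuration.

```python
# Illustrative only: a default-deny policy between hypothetical network zones,
# reflecting the kind of segmentation described above. Zone names and permitted
# flows are assumptions for the example, not a real configuration.

ZONES = {"external", "internal_users", "production", "development"}

# Only explicitly listed (source, destination) pairs are permitted.
ALLOWED_FLOWS = {
    ("external", "production"),        # customer-facing services only
    ("internal_users", "production"),  # staff access to hosted applications
    # developers are deliberately isolated from every other zone
}

def is_allowed(src: str, dst: str) -> bool:
    """Default deny: traffic passes only if the zone pair is whitelisted."""
    if src not in ZONES or dst not in ZONES:
        raise ValueError(f"unknown zone: {src!r} or {dst!r}")
    return (src, dst) in ALLOWED_FLOWS

if __name__ == "__main__":
    print(is_allowed("external", "production"))     # True
    print(is_allowed("development", "production"))  # False - developers zoned off
```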
Firewalls are a big help, but they can easily become too complex. The more points of access to the network and the more sophisticated the configuration of devices for customers, the more intricate the rules become about exactly who can access what and where, he says. Regular ‘housekeeping’, every six to eight weeks, is needed to keep access rules up to date and so minimise false positives – refusing entry to legitimate users. Alongside the near-continual updating of the network and its devices, Harvey recommends that organisations undertake full penetration testing every three to four weeks.
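That housekeeping amounts to periodically reviewing access rules and retiring those that nobody uses any more. The following sketch shows one simple way such a review pass might be automated; the rule records, field names and the idle threshold (roughly the six-to-eight-week cycle mentioned above) are invented for the example.

```python
# Illustrative only: flag firewall rules that have not matched any traffic
# within an assumed idle window, so an administrator can review or retire them.

from datetime import datetime, timedelta

RULES = [
    {"id": 101, "description": "customer VPN to app tier",   "last_hit": datetime(2006, 5, 30)},
    {"id": 102, "description": "legacy partner feed",        "last_hit": datetime(2005, 11, 2)},
    {"id": 103, "description": "monitoring to host systems", "last_hit": datetime(2006, 6, 10)},
]

def stale_rules(rules, as_of, max_idle_days=56):
    """Return rules with no matching traffic inside the idle window."""
    cutoff = as_of - timedelta(days=max_idle_days)
    return [r for r in rules if r["last_hit"] < cutoff]

if __name__ == "__main__":
    for rule in stale_rules(RULES, as_of=datetime(2006, 6, 15)):
        print(f"review rule {rule['id']}: {rule['description']}")
```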
But it is an uphill struggle. The raft of current menaces, from spam and denial-of-service attacks to identity theft and phishing, keeps growing, and all of them are becoming more sophisticated, with attacks increasingly orchestrated by organised criminals, industrial spies and political activists.
Harvey tells of one potentially catastrophic incident he became aware of that originated in China: emails were sent out to politicians and their researchers which, when opened, automatically returned locally held documents to the source of the email. The only truly safe computer might be one that is turned off, detached from the network and sitting behind locked doors, he says, but the right mixture of monitoring, penetration testing and updating of host systems can go a long way towards reducing the risk to the data centre.