If there is one thing that IT industry analysts can agree on, it is that the adoption of cloud computing is set to grow. They may differ on the pace of that growth, and which cloud model will grow the fastest, but all agree that more and more business data will in future be stored, transacted and processed in cloud environments.
Besides the benefits to scalability and resource allocation that this entails, it also means that the ecosystem of business data is about to get more complex by an order of magnitude.
Today, a business can be reasonably sure that the data it cares about is stored in IT systems operated by itself, by its partners or by its suppliers.
Such is the nature of the cloud computing ecosystem, however, that the business may soon be using a software-as-a-service application from one provider, built on a platform-as-a-service offering from another, which in turn is based on infrastructure-as-a-service delivered by a third.
This complicates the essential matter of knowing where one’s data resides, and in exactly what kind of technical environment.
It is therefore incumbent upon all IT professionals, whether or not their current employer is pursuing a cloud computing strategy, to understand the legal, technical and organisational security issues associated with the technology. These issues were the subject of Information Age’s Cloud Security 2011 conference in October.
From a legal perspective, the location of data and data centres is the key issue in cloud computing, explained keynote speaker Rosemary Jay, senior attorney at law firm Hunton and Williams’s privacy and information management practice.
Most of the big-name cloud providers now offer services from within the EU, which removes many of the legal concerns for UK companies wishing to use public cloud services. If a company wants an individual guarantee that data will never move outside the EU, however, cloud providers are not always cooperative, Jay said.
“If you are trying to persuade Google to only locate your data in the European Economic Area, it’s much easier if you are a big customer with commercial muscle,” she explained.
A common concern among clients, Jay said, is the impact of the Patriot Act, which allows US authorities to access data held by any US-owned corporation. Technically, this clashes with the EU’s Data Protection Directive, which asserts that third parties cannot access customer data without their consent.
Jay said this is often over-played, as most businesses are unlikely ever to be the subject of a national security investigation. “Why would they be interested in your data?” she remarked.
A more valid concern is the growing complexity of the cloud ecosystem. In a multi-vendor cloud environment, multiple different parties may be defined as the ‘data processor’, in legal terms, but the role of ‘data controller’, and therefore ultimate responsibility for the data, remains with the customer. Jay therefore advised that companies seek individual agreements with each ‘data processor’ in the cloud service supply chain.
She reminded delegates that businesses are bound not only by the law, but also by the agreements they have made with customers, such as privacy and data protection policies. This can be problematic, Jay explained, as these are often designed with commercial, not legal, interests in mind.
“Sometimes, I see data protection clauses that make me think they were written by the marketing department,” she said. “You see phrases like, ‘We will cherish your data,’ cropping up, which are meaningless.”
Due diligence in selecting a cloud provider
The theme of managing risk in the face of a complex cloud computing ecosystem was also discussed by Nick Law of cloud consultancy start-up sineBridge.
Law explained that the amount of due diligence an organisation must undergo when selecting a cloud provider increases as one moves up the cloud stack from IaaS, through PaaS and into SaaS.
When using low-level infrastructure services such as virtual machines or storage capacity in the cloud, the customer is still largely responsible for the security of the system. It is the customer who chooses how to configure that infrastructure, what software will run on top of it and exactly how it will be used.
During due diligence, therefore, the customer has a limited number of fundamental questions to ask potential suppliers, such as where the data is hosted and what happens to it once the customer deletes their account.
As more functionality is added to the cloud service, the provider takes on more responsibility for the security of the data. The customer is handing more control over to the provider, and due diligence must therefore be more exhaustive.
For example, when choosing a PaaS provider, the buyer must ask all the same questions as they would of an IaaS provider, plus new questions about database security and developer access rights. When selecting a SaaS provider, questions about user identity and access management also come into play.
Law echoed Rosemary Jay’s sentiment when he said that every degree of control that is passed to a cloud provider must be counter-balanced by a degree of due diligence, because the customer’s responsibility for the data remains constant: “It is possible to transfer commercial risk to your cloud providers, but all other risk remains with you.”
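To make the cumulative nature of that due diligence concrete, here is a minimal sketch in Python. The questions and function names are illustrative assumptions, not the checklist Law presented; the point is simply that each step up the stack inherits the previous layer's questions and adds its own.

```python
# Illustrative sketch only: these questions are examples, not Law's checklist.
# It shows how due-diligence questions accumulate as control passes up the
# stack from IaaS through PaaS to SaaS.

IAAS_QUESTIONS = [
    "Where is the data physically hosted?",
    "What happens to the data once the customer deletes their account?",
]

PAAS_QUESTIONS = IAAS_QUESTIONS + [
    "How is the platform's database secured?",
    "Who on the provider side has developer access, and how is it controlled?",
]

SAAS_QUESTIONS = PAAS_QUESTIONS + [
    "How are user identities managed and provisioned?",
    "How is end-user access to the application controlled and audited?",
]

def due_diligence_checklist(service_model: str) -> list[str]:
    """Return the cumulative question set for a given cloud service model."""
    return {
        "IaaS": IAAS_QUESTIONS,
        "PaaS": PAAS_QUESTIONS,
        "SaaS": SAAS_QUESTIONS,
    }[service_model]

if __name__ == "__main__":
    for model in ("IaaS", "PaaS", "SaaS"):
        print(f"{model}: {len(due_diligence_checklist(model))} questions to ask")
```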
Focus on the data
Andrew Bushby, UK information security director at Oracle, advised delegates to grill potential public cloud providers on their database security measures.
Improper access to the database can be catastrophic, and in more ways than are typically acknowledged, Bushby said. He cited a financial services customer whose biggest concern is not someone stealing its data, but someone tampering with it to influence public markets in its favour.
Access to data within the database must therefore be managed in a more sophisticated fashion than is currently the norm, Bushby said. “Database administrators typically have access to everything in the entire database,” he explained. “Should they? No!”
Instead, Bushby advocated a system in which access rights are managed so that DBAs are privy only to the data they need to do their jobs. Similarly, data should be encrypted within the database, not just when it is in transit.
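The sketch below illustrates the principle in Python, not any Oracle product: a sensitive column is encrypted at the application layer before it reaches the database, so a DBA who can read the table sees only ciphertext. It assumes the third-party cryptography package is installed, and the key handling is deliberately simplified.

```python
# Minimal sketch of application-side encryption at rest (illustrative, not
# Oracle's approach): the database only ever stores ciphertext, so a DBA with
# full table access still cannot read the sensitive values.
import sqlite3
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, the key lives outside the database
cipher = Fernet(key)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance BLOB)")

# Encrypt before insertion; only ciphertext crosses into the database.
db.execute("INSERT INTO accounts (balance) VALUES (?)",
           (cipher.encrypt(b"1250000.00"),))

# This is the DBA's view: opaque bytes, unreadable without the key.
ciphertext = db.execute("SELECT balance FROM accounts").fetchone()[0]
print("Stored value:", ciphertext[:16], "...")

# Only code holding the key can recover the plaintext.
print("Decrypted:", cipher.decrypt(ciphertext).decode())
```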
A similar warning was sounded by Andrew Yeomans of the Jericho Forum, who argued that cloud computing is yet another force in technology that is making the traditional, perimeter-based approach to information security untenable.
Where once organisations might have been able to lock the entry points to a network using a firewall, cloud computing means that businesses will have to find ways to secure data even as it moves across networks and organisational boundaries.
“Perimeters are disappearing from networks,” Yeomans said. “In 2011, we’re in a crossover between network controls and data controls.”
This data-centric approach to security requires more fine-grained control of end-user access, and therefore more sophisticated identity management.
Happily, this may save organisations money, as it will allow them to apply only as much security as is required, Yeomans argued. “By making your protective steps fit with the sensitivity profile of the data you hold, you can get the most cost-effective solution,” he said.
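As a rough illustration of that data-centric, sensitivity-tiered approach, the hypothetical sketch below maps classification labels to controls and grants access on the user's identity rather than their network location. The labels, roles and control names are assumptions for illustration, not the Jericho Forum's own model.

```python
# Hypothetical sketch of sensitivity-tiered, identity-based access control.
# Labels, roles and controls are illustrative assumptions only.

SENSITIVITY_CONTROLS = {
    "public":       {"encrypt_at_rest": False, "allowed_roles": {"anyone"}},
    "internal":     {"encrypt_at_rest": False, "allowed_roles": {"employee"}},
    "confidential": {"encrypt_at_rest": True,  "allowed_roles": {"finance", "legal"}},
    "restricted":   {"encrypt_at_rest": True,  "allowed_roles": {"board"}},
}

def may_access(user_roles: set[str], sensitivity: str) -> bool:
    """Decide access from the user's identity (roles) and the data's label,
    rather than from which network segment the request originated."""
    allowed = SENSITIVITY_CONTROLS[sensitivity]["allowed_roles"]
    return "anyone" in allowed or bool(user_roles & allowed)

print(may_access({"employee"}, "internal"))       # True
print(may_access({"employee"}, "confidential"))   # False
print(may_access({"finance"},  "confidential"))   # True
```

Matching the strength of the controls to the sensitivity label, as in the table above, is what allows protection (and cost) to scale with the value of the data rather than being applied uniformly.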