Before you decide whether to send your data into a public cloud environment, understand what it means and how sensitive it is. Thanks to modern classification technology, you can automate this process.
So, you want to take advantage of the flexibility and potential cost savings that the cloud offers. A single slip can make cloud-based data highly vulnerable, though, and you don’t want to end up as an unfortunate headline in the IT press. How can you decide what data to move to the cloud?
To begin with, you need an infrastructure that gives you the choice. Hybrid cloud environments marry the control that you get from an on-premise solution with the flexibility of public cloud platforms.
This hybrid cloud approach has gained significant traction in the last few years. Wikibon and North Bridge Growth Equity Venture Partners surveyed 1,351 IT professionals from vendor and customer organisations last year, and found that 47% of them had moved to a hybrid cloud environment.
Armed with computing, networking and storage resources that can exchange data seamlessly between your offices and the public cloud, you’ll have the agility to store data wherever you deem it appropriate. That’s only possible if you understand your data to begin with, though. To do that, you have to classify it.
Know your data
Data classification is an important part of the cloud governance process. It involves tagging digital files and records with metadata describing their characteristics. Armed with this information, companies can automate decisions about where to store their files, and how. This automation in turn enables them to scale these storage decisions across large data volumes.
This metadata can include information about a file’s sensitivity level, using predefined categories. It’s an approach that has worked inside UK government and intelligence circles for years, where documents have been classified as top-secret, for example. Now, digital technology makes this possible in the commercial world.
Other metadata ‘tags’ are useful when processing a file or record against a cloud security policy. Examples include information about who created the document, how long an organisation must keep it for, who handles it, and when they must dispose of it.
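To make the idea concrete, here is a minimal sketch in Python. The sensitivity categories, metadata fields and routing rule are all hypothetical illustrations, not a prescribed scheme:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sensitivity categories; real schemes vary by organisation.
SENSITIVITY_LEVELS = ("public", "internal", "confidential", "secret")

@dataclass
class FileMetadata:
    path: str
    sensitivity: str     # one of SENSITIVITY_LEVELS
    owner: str           # who created the document
    handler: str         # who handles it
    retain_until: date   # how long the organisation must keep it

def storage_target(meta: FileMetadata) -> str:
    """Illustrative routing rule: the most sensitive data stays on-premise."""
    if meta.sensitivity in ("confidential", "secret"):
        return "on-premise"
    return "public-cloud"

record = FileMetadata("q3-forecast.xlsx", sensitivity="confidential",
                      owner="j.smith", handler="finance",
                      retain_until=date(2024, 12, 31))
print(storage_target(record))  # -> "on-premise"
```

Once tags like these are attached at creation time, the routing decision needs no human in the loop, which is what lets it scale across millions of files.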
A file’s sensitivity level will often dictate where a company stores it, with sensitive files staying on locally-managed servers. This isn’t always the case, though. Some cloud-based applications, such as Salesforce, must store sensitive information to do their job. In those situations, organisations need a subtler approach to data storage.
Tokens and encryption
Two related technologies are useful when you do need to store sensitive data in a public cloud environment.
The first is straightforward encryption, which protects data at rest. Some SaaS application providers will encrypt data for you, but beware of a subtlety here: whoever controls the encryption keys controls your data.
If a SaaS or other cloud service provider holds those keys, you are relying not only on their internal security to protect your data, but also on their legal policies when responding to information requests from government agencies.
If a government department in another country requests information about one of your customers from a service provider, will you have any control – or even knowledge – of its disclosure?
This makes controlling your own encryption keys important to using the technology securely. That control comes with its own challenges, though. You must protect those keys, which may involve some form of hardware appliance. Lose them, and you lose your data.
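As a rough sketch of client-side encryption with a customer-held key, the Python example below uses the third-party cryptography package (an assumption for illustration, not something any particular provider prescribes):

```python
from cryptography.fernet import Fernet

# Generate a key once and keep it on YOUR infrastructure -- in practice
# inside a hardware appliance or key management service, not a plain file.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt locally, before the data ever leaves your premises; the cloud
# provider only ever sees the ciphertext.
ciphertext = cipher.encrypt(b"customer record: Jane Doe, 42 High St")

# Only a holder of the key can reverse the operation.
plaintext = Fernet(key).decrypt(ciphertext)
assert plaintext == b"customer record: Jane Doe, 42 High St"
```

The point of the design is in the first comment: because the key never leaves your control, a disclosure request served on the cloud provider yields only ciphertext.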
Along with encryption, the General Data Protection Regulation (GDPR) explicitly mentions another technique – pseudonymisation – as a protective measure. It removes sensitive data, such as customer names and addresses, from files and records, substituting unique data tokens.
An organisation stores copies of these tokens on its own computers, along with the sensitive data they stand in for. When someone accesses a record in the cloud, the software can use the token to look up the locally stored sensitive data. This spares the company from having to store sensitive information directly in the cloud.
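A minimal Python sketch of the idea, assuming an in-memory dictionary stands in for the locally held vault (a real deployment would use a database on your own servers, with a broker in the data path):

```python
import secrets

# Local token vault: token -> sensitive value. In production this would be
# a durable store on your own infrastructure, not an in-memory dict.
vault = {}

def tokenise(value: str) -> str:
    """Replace a sensitive value with an opaque, random token stored locally."""
    token = secrets.token_urlsafe(16)
    vault[token] = value
    return token

def detokenise(token: str) -> str:
    """Resolve a token back to the sensitive value it stands in for."""
    return vault[token]

# The cloud-hosted record carries only the token...
cloud_record = {"customer": tokenise("Jane Doe, 42 High St")}

# ...while the real data never leaves your premises.
print(detokenise(cloud_record["customer"]))
```

Unlike encryption, there is nothing in the cloud-side token to attack: it is a random value whose only meaning lives in the local vault.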
Cloud access security brokers (CASBs) can help you use these techniques properly. They serve as intermediaries between your on-premise and cloud environments, securing information with encryption and tokens as it flows between them.
These brokers will often layer on extra protective measures, such as data leak prevention. This analyses data as it leaves your premises, looking for information in sensitive formats (such as credit card numbers). It can then apply security policies to that data, blocking the transfer and informing managers of the inappropriate attempt.
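As a simplified illustration (not how any particular CASB product implements it), a detector for credit card numbers might pair a pattern match with the Luhn checksum to weed out false positives:

```python
import re

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum used to validate candidate card numbers."""
    digits = [int(d) for d in number][::-1]
    total = sum(digits[0::2]) + sum(sum(divmod(2 * d, 10)) for d in digits[1::2])
    return total % 10 == 0

# 13-16 digits, optionally separated by spaces or hyphens.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def contains_card_number(text: str) -> bool:
    for match in CARD_PATTERN.finditer(text):
        if luhn_valid(re.sub(r"[ -]", "", match.group())):
            return True  # policy hook: block the transfer, alert a manager
    return False

print(contains_card_number("Order ref 4111 1111 1111 1111"))  # True (test number)
```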
Beyond these technical measures, ensure that only the right people have access to data stored across your hybrid cloud infrastructure. This means designing identity and access management (IAM) solutions that serve both on-premise and cloud-based environments.
Depending on the public cloud environment you use, you may find IAM included as a basic service, or as part of a marketplace offering. Amazon Web Services and Microsoft Azure both offer their own approaches. It may be possible to use credentials stored in an Active Directory instance on your premises to govern cloud-based access.
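As one hedged example of what that federation can look like on AWS, using the boto3 SDK: the ARNs and the assertion below are placeholders, and the exact setup depends on your identity provider.

```python
import boto3

# The SAML assertion would come from your on-premise identity provider
# (for example AD FS in front of Active Directory) at sign-in time.
saml_assertion = "<base64-encoded assertion from your IdP>"  # placeholder

sts = boto3.client("sts")
response = sts.assume_role_with_saml(
    RoleArn="arn:aws:iam::123456789012:role/HybridCloudUsers",        # placeholder ARN
    PrincipalArn="arn:aws:iam::123456789012:saml-provider/CorpADFS",  # placeholder ARN
    SAMLAssertion=saml_assertion,
)
credentials = response["Credentials"]  # temporary key, secret and session token
```

The appeal of this pattern is that the credentials issued are temporary, and access decisions continue to flow from the same directory you already manage on-premise.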
Armed with these techniques, your company can decide which data is best stored in the cloud, and then take control of its protection. It all starts with understanding what you’re storing, and documenting that understanding. In a hybrid cloud environment, all data should come with a label.
Sourced from Danny Maher, CTO, HANDD Business Solutions