Ever larger amounts of data are clogging up the cloud, so how can your organisation ensure that it doesn’t fall victim to the spectre of network latency and packet loss in the face of today’s cyber security threats?
In 2017, IDC’s ‘Data Age 2025: The Evolution of Data to Life-Critical’ study predicted that analytics and the Internet of Things will drive data volumes to 163 zettabytes (ZB) by 2025. With one type of cloud or another sitting behind both analytics and the Internet of Things, there is plenty of scope for latency to creep in and render them ineffective, or even useless. This brings us back to the age-old problem of moving data over distance: the further data travels, the greater the latency and packet loss. The resulting loss of network performance can, at worst, make timely disaster recovery very difficult to achieve.
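To put rough numbers on this (my own illustrative figures, not from the IDC study), the well-known TCP window and Mathis approximations show how quickly round-trip time and packet loss cap a single data stream, regardless of how fast the underlying link is:

```python
# Rough illustration of why distance (RTT) and packet loss throttle a single
# TCP stream, whatever the link's nominal bandwidth. All figures are assumptions.

def window_limited_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Throughput ceiling set by TCP window size divided by round-trip time."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1e6

def mathis_limited_mbps(mss_bytes: int, rtt_ms: float, loss_rate: float) -> float:
    """Mathis et al. approximation: throughput <= (MSS / RTT) / sqrt(loss)."""
    return (mss_bytes * 8) / (rtt_ms / 1000) / (loss_rate ** 0.5) / 1e6

# A default 64 KB window over a 5 ms metro link vs an 80 ms transatlantic link.
print(window_limited_mbps(65_535, 5))     # ~105 Mbit/s
print(window_limited_mbps(65_535, 80))    # ~6.5 Mbit/s

# Add 0.1% packet loss on the long link (1,460-byte segments).
print(mathis_limited_mbps(1_460, 80, 0.001))  # ~4.6 Mbit/s
```

On those assumptions, the same default window that comfortably fills a metro link delivers only a few megabits per second across an ocean, which is exactly the gap this article is concerned with.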
Cyber-attack challenge
Added to this is the spectre of the growing number of cyber-attacks, which makes it even more of a challenge to get data securely to the cloud. One common mistake is to assume that data is safe simply because it is stored in the cloud. This breeds complacency: a feeling that individuals, and perhaps even their organisations, need do nothing else to protect it. Yet even data that is ‘safely’ stored in the cloud brings its own risks. Several cloud providers have, for example, suffered complete outages, some lasting days at a time.
So, to protect cloud data, it should be stored in at least three different locations that are not in close proximity to each other; this may mean another region, or even another country. Another option is to engage a second cloud provider, which reduces the risk of downtime and allows for failover if the primary cloud provider suffers an outage.
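As a purely illustrative sketch of what ‘three copies, far apart’ can look like, the snippet below pushes the same (already encrypted) backup object to three S3-compatible destinations. The endpoints, bucket names and file path are invented for the example, and boto3 is just one of many ways to do it:

```python
# Hypothetical sketch: replicate one backup object to three widely separated,
# independently operated S3-compatible destinations. Endpoints, buckets and
# the file path are invented for illustration; credentials come from the
# environment as usual for boto3.
import boto3

DESTINATIONS = [
    {"endpoint_url": "https://s3.eu-west-1.amazonaws.com",      "bucket": "dr-backups-eu"},
    {"endpoint_url": "https://s3.ap-southeast-2.amazonaws.com", "bucket": "dr-backups-apac"},
    {"endpoint_url": "https://objects.second-provider.example", "bucket": "dr-backups-alt"},
]

def replicate(local_path: str, key: str) -> None:
    for dest in DESTINATIONS:
        s3 = boto3.client("s3", endpoint_url=dest["endpoint_url"])
        s3.upload_file(local_path, dest["bucket"], key)
        print(f"copied {key} to {dest['bucket']} via {dest['endpoint_url']}")

replicate("/backups/nightly.tar.gz.enc", "nightly.tar.gz.enc")
```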
Plan for continuity
This also means that organisations need to plan for both business and service continuity. The same applies if they want to prevent hackers from accessing their cloud-based data, or data that is in transit from one cloud to another. Herein lies another issue, however: growing data volumes are making it much harder to get data into the cloud in the first place. Data security is therefore only one part of the equation that demands a response, particularly as traditional solutions such as WAN optimisation and SD-WANs deal inadequately with latency, and WAN optimisation struggles with encrypted data.
Previously, as with cloud backup appliances, organisations have relied on data deduplication techniques, more commonly packaged as WAN optimisation. However, for the reason highlighted above, when data is encrypted for transfer and storage in the cloud, this approach provides no performance gain across the WAN.
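A quick way to see why is to chunk the same data before and after encryption and count how many chunks repeat. The sketch below is not any vendor’s implementation; it simply uses fixed-size chunks and the Python cryptography library’s Fernet recipe to make the point:

```python
# Illustration only: identical plaintext chunks deduplicate almost entirely,
# but after encryption every chunk looks unique, so a dedupe-based WAN
# optimiser gains nothing.
import hashlib
from cryptography.fernet import Fernet

CHUNK = 4096
data = b"A" * (CHUNK * 100)            # 100 identical 4 KB chunks

def unique_chunks(blob: bytes) -> int:
    digests = {hashlib.sha256(blob[i:i + CHUNK]).hexdigest()
               for i in range(0, len(blob), CHUNK)}
    return len(digests)

print(unique_chunks(data))                                          # 1
print(unique_chunks(Fernet(Fernet.generate_key()).encrypt(data)))   # ~134, all unique
```

Identical plaintext collapses to a single chunk, while the encrypted version yields no repeats at all, leaving a dedupe-based optimiser with nothing to squeeze.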
Golden backup rules
Despite this, there are some golden rules to apply when organisations implement a backup strategy. The key rule is to avoid designing around the backup process, which is a controlled, scheduled activity. Instead, design for the recovery process. Why? Because that is when things typically go wrong, especially in a mass data loss scenario or after a ransomware attack.
Many organisations are now turning to the cloud for backup, and many cloud providers offer this as a service. It is often implemented as a two-part solution: an onsite appliance with local storage receives the backup first, acting as an intermediate cache, and a dedupe engine then minimises the data payload that is sent to the cloud.
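Conceptually, the dedupe step in such an appliance amounts to a chunk index: only chunks the cloud copy has not already seen are shipped. A minimal sketch, with a hypothetical upload function and an in-memory index standing in for the real machinery:

```python
# Minimal sketch of the dedupe step in an onsite backup appliance: split the
# backup into chunks and only ship chunks the cloud copy has not seen before.
# The index and the upload function are simplified stand-ins.
import hashlib

CHUNK = 4096
cloud_index = set()                    # in reality: a persisted chunk catalogue

def send_chunk_to_cloud(digest, chunk):
    """Placeholder for the real upload; invented for illustration."""
    print(f"uploading chunk {digest[:12]}... ({len(chunk)} bytes)")

def backup(blob):
    sent = skipped = 0
    for i in range(0, len(blob), CHUNK):
        chunk = blob[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in cloud_index:
            skipped += 1               # already held in the cloud copy
        else:
            send_chunk_to_cloud(digest, chunk)
            cloud_index.add(digest)
            sent += 1
    print(f"sent {sent} new chunks, deduplicated {skipped}")
```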
Whilst on the face of it this seems an ideal solution, there are some shortcomings in the design. Firstly, until the last byte has been uploaded to the cloud, the disaster recovery (DR) cloud copy is not complete. Then there is the issue of bringing data back from the cloud if the local cache is lost or corrupted by, for example, ransomware.
Data encryption
Pulling this data back and then re-inflating it can take a considerable amount of time, just when you can least afford it. Lastly, most organisations are unaware that deduplication does not work with compressed or encrypted files. Yet many companies now use the encryption facility available in backup tools to protect data at rest in the cloud for GDPR reasons, and suddenly find that this makes moving data to and from the cloud highly inefficient.
As a result, a totally new approach is needed to accelerate data to and from the cloud. This is required not only to meet the SLA for backup but, more importantly, to comply with the critical SLAs for recovering the business after a data loss occurs. WAN data acceleration solutions, such as PORTrockIT, do not rely on data deduplication techniques, so they can accelerate encrypted data, and even deduplicated data, over any distance, while mitigating the effects of latency and packet loss. They are, in my view, the way forward.
One of the benefits of WAN data acceleration over data deduplication is the low computational overhead of the process. This lifts the previous performance ceiling on transferring data: it is now possible to make full use of the increasingly prevalent 10Gb/s WAN links, with the ability to scale up to 40Gb/s of connectivity. This finally makes a whole range of new possibilities realistic, such as ‘follow-the-sun’ workloads, or the mass migration of data to the cloud for web-scale computation tasks.
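For a sense of scale, using my own round numbers and an assumed 90% link utilisation:

```python
# Back-of-the-envelope transfer times, assuming the link can actually be kept
# full, which is precisely what removing the latency-imposed ceiling allows.
def hours_to_transfer(terabytes, link_gbps, efficiency=0.9):
    bits = terabytes * 1e12 * 8
    return bits / (link_gbps * 1e9 * efficiency) / 3600

print(round(hours_to_transfer(100, 10), 1))   # ~24.7 hours for 100 TB at 10 Gb/s
print(round(hours_to_transfer(100, 40), 1))   # ~6.2 hours at 40 Gb/s
```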
Role for deduplication
Despite this, data deduplication still plays a role, but it does not work for all data types and can impose a large, time-consuming computational overhead, especially when the data is re-inflated. With WAN data acceleration handling the transfer, it may be quicker not to dedupe the data before sending it, avoiding the re-inflation step and greatly improving the recovery SLA. With the ability to move data at the same speed regardless of distance, consider how this could change your data processing, let alone your organisation’s ability to keep going in the face of a man-made or natural disaster.
Organisations, for example, need to be able to deploy their disaster recovery strategies rapidly whenever a cyber-attack causes downtime. Prevention remains better than cure, so it is important to avoid relying on a single cloud provider, to ensure that your organisation’s data gets to and from the cloud securely, and to keep it safe while it is stored there. Organisations would also be well advised to back up cloud-stored data internally on a regular basis. Leaving data security solely to a cloud provider can leave your organisation vulnerable, so it is important to consider all possible eventualities. This may include considering the deployment of WAN data acceleration, as it can secure data in flight.
Written by David Trossell, CEO and CTO of Bridgeworks