With the Covid-19 pandemic causing economic uncertainty, choosing the right investments in cloud storage is as important as ever, if not more so. Cloud tiering is coming to the fore as a way to store unstructured data and assets that see minimal use. But this is only effective if the right infrastructure changes are made.
A notable example of where this process can be useful is object storage of backup data. This type of storage scales virtually without limit and accepts data of any size or format. Applying cloud tiering to object storage keeps backup data properly organised and low-priority assets out of expensive tiers.
What storage tiers are available?
Cloud tiering solutions are designed for destinations that can handle masses of unstructured backup data, such as Amazon Simple Storage Service (Amazon S3) and Azure block blobs. Typically, data can be placed into the following tiers, all of which have varying price ranges and uses:
- Local tier, for general user files such as documents and email attachments, to be retained for up to 30 days;
- Remote site tier, also for user files, to be retained for up to 90 days;
- Hot and cold cloud tiers, for general backup data, to be retained from three months up to three years;
- Archive and deep archive cloud tiers, for long-term regulatory compliance, to be retained for multiple years.
Here, data that is rarely used but must be retained for regulatory reasons is stored in the cloud and kept secure, as it cannot be accessed or restored for at least a year.
Using cloud tiers to store backup data comes with the benefit of monthly operational expenditure (OpEx) that’s easier to control than capital expenditure (CapEx), the cost model that comes with the two shallowest tiers shown above.
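In practice, these transitions between tiers are often declared as lifecycle rules on the storage destination itself. As a rough illustration, the sketch below expresses such a policy for Amazon S3 using the boto3 SDK; the bucket name, prefix, transition days and retention period are all hypothetical placeholders, loosely mapped to the tiers listed above.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical lifecycle rule: age backup objects through progressively
# cheaper storage classes, then expire them once retention ends.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-backup-data",
                "Filter": {"Prefix": "backups/"},  # hypothetical prefix
                "Status": "Enabled",
                "Transitions": [
                    # ~3 months in the colder general-purpose tier...
                    {"Days": 90, "StorageClass": "STANDARD_IA"},
                    # ...then into archive...
                    {"Days": 365, "StorageClass": "GLACIER"},
                    # ...and finally deep archive for long-term compliance.
                    {"Days": 730, "StorageClass": "DEEP_ARCHIVE"},
                ],
                # Expire after a multi-year retention period (assumed ~7 years).
                "Expiration": {"Days": 2555},
            }
        ]
    },
)
```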
Why not store all backup data in the cloud?
Even if recovery time for retrieving data from either of the two cloud tiers isn’t a problem for your organisation, the case for keeping some backup data away from the cloud is twofold. Firstly, cloud storage comes with transaction costs: every request to retrieve data carries a charge, and the deeper the tier, the higher the cost per gigabyte to get data back.
Secondly, network costs are likely to increase when retrieving backup data from the cloud; maintaining a wide area network (WAN) link for cloud tiering requires additional bandwidth, the costs of which are not included in data transfer charges.
Having a combination of CapEx and OpEx approaches, as well as cloud and on-premise storage, provides the benefits of both: the low overheads and pricing of the OpEx model, alongside the short time to first byte (TTFB) of the CapEx model.
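To make that trade-off concrete, here is a back-of-the-envelope estimate of what a single large restore from the cloud might cost. Every price below is an illustrative placeholder, not a quote from any provider; real bills vary by region, tier and contract, and WAN bandwidth would be billed on top.

```python
# Rough, illustrative estimate of cloud restore costs.
# All rates are assumed placeholders, not any provider's actual pricing.
RESTORE_GB = 1_000          # restoring 1 TB of backup data
EGRESS_PER_GB = 0.09        # data transfer out, $/GB (assumed)
RETRIEVAL_PER_GB = 0.01     # per-GB retrieval charge (assumed)
GET_REQUESTS = 100_000      # GET requests issued by the restore (assumed)
PER_1K_REQUESTS = 0.0004    # request pricing, $/1,000 requests (assumed)

egress_cost = RESTORE_GB * EGRESS_PER_GB
retrieval_cost = RESTORE_GB * RETRIEVAL_PER_GB
request_cost = GET_REQUESTS / 1_000 * PER_1K_REQUESTS

total = egress_cost + retrieval_cost + request_cost
print(f"Estimated restore cost: ${total:,.2f}")
# ~$100 for this one restore -- before the extra WAN bandwidth costs,
# which sit outside these data transfer charges.
```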
Hot and cold vs archive and deep storage
Data stored as hot and cold objects in the cloud can be retained for up to three years, while assets categorised as archive and deep archive can be kept for even longer. But what else needs to be considered when choosing which of the deepest tiers to use?
Hot and cold object storage is best suited to preparing for audits carried out by IT departments. As such, the data suited to these tiers, while not required as regularly as local or remote storage data, should be available when the audit takes place.
Meanwhile, long-term archive and deep archive storage is best for demonstrating compliance with governmental and regulatory bodies and laws. Because this data is the least used within the enterprise, it should be kept in the cheapest tier.
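In practice, this choice often comes down to the storage class assigned when a backup object is written, and to the restore workflow when it is needed again. The boto3 calls below sketch that distinction; the bucket, keys and payloads are hypothetical placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Audit-ready backups: a colder class that can still be read on demand.
s3.put_object(
    Bucket="example-backup-bucket",        # hypothetical
    Key="audits/2020-q3.tar.gz",           # hypothetical
    Body=b"<audit backup bytes>",
    StorageClass="STANDARD_IA",
)

# Compliance backups: deep archive, the cheapest class, where objects
# must be explicitly restored before they can be read again.
s3.put_object(
    Bucket="example-backup-bucket",
    Key="compliance/2015-ledger.tar.gz",
    Body=b"<compliance backup bytes>",
    StorageClass="DEEP_ARCHIVE",
)

# Restoring from deep archive is an asynchronous job measured in hours:
s3.restore_object(
    Bucket="example-backup-bucket",
    Key="compliance/2015-ledger.tar.gz",
    RestoreRequest={"Days": 7, "GlacierJobParameters": {"Tier": "Bulk"}},
)
```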
Keeping data secure in transit
QoreStor from Quest secures backup data using built-in encryption, secure connect and FIPS 140-2 compliance. The platform’s deduplication capabilities prevent you from making the costly error of inadvertently sending duplicate data to the cloud. QoreStor also compresses the data, which, along with deduplication, eases the strain on the WAN.
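QoreStor’s internals are proprietary, but the general technique of deduplicating and compressing data before it crosses the WAN can be sketched generically. The snippet below is a minimal illustration of content-hash deduplication with compression, under assumed chunk sizes; it is not a description of how QoreStor itself works.

```python
import hashlib
import zlib

seen_chunks = set()  # hashes of chunks already sent to the cloud

def prepare_chunks(data: bytes, chunk_size: int = 4096):
    """Yield only compressed, not-yet-seen chunks for upload."""
    for offset in range(0, len(data), chunk_size):
        chunk = data[offset:offset + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in seen_chunks:
            continue  # duplicate chunk: skip the costly re-upload
        seen_chunks.add(digest)
        yield digest, zlib.compress(chunk)  # compression eases WAN strain

# Usage: only unique chunks cross the WAN, and each is compressed.
backup = b"example backup payload " * 1_000
unique = list(prepare_chunks(backup))
print(f"{len(unique)} unique chunks to upload")
```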
With economic uncertainty rife and regulation continuing to evolve, you need to make the best investments possible for backup storage, while ensuring that data compliance criteria are met.