All storage administrators will be familiar with the fact that enterprise workloads are just as varied as enterprise applications.
Some workloads, such as online transaction processing (OLTP) and virtual desktops, are extremely latency sensitive, whereas others, such as file shares and backups, are throughput sensitive.
In addition, there is virtualisation I/O, which sits somewhere in the middle and can make it harder to distinguish workloads due to the ‘I/O blender’ effect.
As a result, storage administrators need to be cognisant of what storage functionality is required and what best suits the workload they are operating.
So, taking all of this into consideration, how are organisations expected to choose?
Much like reviewing features such as engine size, transmission and fuel type before purchasing a new car, before choosing a new storage system businesses need to review their buying standards to ensure they get a solution that best suits their needs.
That being said, there are some key aspects companies should take into account.
1. IOPS (performance)
This is a key factor. Traditionally, performance has been achieved by scaling the number of disks (spindles) up or down. However, the introduction of hybrid architectures that leverage flash to accelerate performance means organisations can achieve the same performance with significantly fewer disks, spinning or otherwise.
2. Latency
Whether it is an end-user application (VDI) or a server application (database or mail servers), lower storage latency affects perceived and real performance. As a result, it is important for an organisation to review which applications are more latency sensitive and which require less attention.
3. Capacity
Reviewing storage capacity levels should be an ongoing focus for all organisations as the volume of enterprise data continues to increase, with IDC stating that “annual sales of storage capacity will grow by more than 30 percent every year between 2013 and 2017”. Scalability is crucial in terms of capacity; solutions need to be able to accommodate data growth as and when it is required.
4. Storage functionality
Rich storage functionality also needs to be considered. This covers several important aspects, such as access methodologies (not getting boxed into a single protocol), data protection (RAID, local snapshots, remote replication and DR support), high availability, and integration with hypervisors and host operating systems.
While factors such as capacity and storage features are important, all the functionality in the world is useless unless an organisation can ensure its databases and other applications are well protected locally and remotely, and can withstand disasters.
5. Cost
There are a number of aspects that influence budgets; everything from acquisition costs to total cost of ownership will have an impact, not to mention the budget constraints of individual organisations. Businesses need to ensure they do not spend outside their means to achieve their goals; everything comes back to cost and return on investment (ROI).
Traditionally, HDDs (hard disk drives) have always been cheaper than their flash counterparts, but with the recent decline in the price of SSDs (solid-state drives) this may no longer be the case.
An ideal formula would consider cost per IOPS, cost per TB and ongoing management/support costs. Data reduction technologies such as deduplication and compression can significantly reduce the cost of flash systems.
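The formula above can be sketched as a simple calculation. The figures below are purely illustrative assumptions, not vendor pricing; the point is that data reduction changes the effective cost per TB, so an all-flash system can compare favourably despite a higher sticker price.

```python
def cost_metrics(acquisition_cost, annual_support, years, iops, usable_tb,
                 data_reduction=1.0):
    """Compare storage systems on cost per IOPS and cost per effective TB.

    data_reduction is the expected deduplication/compression ratio
    (e.g. 3.0 means 3:1). All inputs here are hypothetical examples.
    """
    tco = acquisition_cost + annual_support * years  # total cost of ownership
    effective_tb = usable_tb * data_reduction        # capacity after reduction
    return {
        "cost_per_iops": tco / iops,
        "cost_per_tb": tco / effective_tb,
    }

# Hypothetical comparison: hybrid array vs all-flash with 3:1 data reduction
hybrid = cost_metrics(120_000, 15_000, years=5, iops=50_000, usable_tb=100)
flash = cost_metrics(180_000, 20_000, years=5, iops=200_000, usable_tb=40,
                     data_reduction=3.0)
```

With these example numbers, the all-flash system's higher acquisition cost is offset by far better cost per IOPS, while data reduction narrows the gap on cost per TB.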
6. Quality of flash
Finally, one often ignored, but highly critical aspect is the quality of flash used within all-flash or hybrid storage systems.
Multi-level cell (MLC) NAND, the most popular flash media used in all-flash and hybrid storage solutions, predominantly comes in two flavours: enterprise-grade (eMLC) and consumer-grade (cMLC).
The major difference between the two is the longevity of the flash memory cells, as defined by the number of times they can be erased and re-written (programme-erase, or P/E, cycles). It is important to be aware of the quality of flash in a storage array, as it can greatly impact the lifespan of the array and the reliability of data.
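A rough back-of-the-envelope estimate shows why P/E endurance matters. The cycle counts, write rates and write-amplification factor below are illustrative assumptions (real endurance ratings vary by vendor and NAND generation), but the arithmetic holds: endurance scales linearly with rated P/E cycles, so lower-grade flash wears out proportionally sooner under the same workload.

```python
def flash_lifespan_years(capacity_tb, pe_cycles, daily_writes_tb,
                         write_amplification=2.0):
    """Rough lifespan estimate for a flash array from P/E cycle endurance.

    Total endurance (TB written) = capacity * rated P/E cycles, reduced by
    the write-amplification factor. All inputs are hypothetical examples.
    """
    total_writes_tb = capacity_tb * pe_cycles / write_amplification
    return total_writes_tb / (daily_writes_tb * 365)

# Assumed ratings: eMLC ~10,000 P/E cycles vs cMLC ~3,000, on a 20 TB array
# absorbing 5 TB of writes per day
emlc_years = flash_lifespan_years(20, 10_000, daily_writes_tb=5)
cmlc_years = flash_lifespan_years(20, 3_000, daily_writes_tb=5)
```

Under these assumptions the eMLC array lasts more than three times as long as the cMLC one, which is why the grade of flash inside an array deserves scrutiny alongside the headline price.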
Flash is a revolutionary media, but before organisations get swept up in the tide it is important to take a balanced approach, keeping these criteria in mind.
In doing so, organisations should be afforded a storage solution that best addresses individual business needs.
Sourced from Paul Silver, Tegile