A glance at the IT storage industry press today reveals an avalanche of marketing aimed squarely at convincing readers that the future of IT storage is “all flash”, built on SSDs (solid-state drives).
Indeed, given much of the messaging, one would be forgiven for thinking that flash is the panacea for storage and that HDDs (hard disk drives) are headed for a mournful demise. The truth, however, couldn’t be further from the picture painted by this all-flash marketing firehose. The hard disk drive industry is in fact very much alive and well.
According to IDC’s Worldwide Hard Disk Drive Forecast Update, 2016–2020, “Growing demand for HDD petabytes by enterprise customers continues to be a bright spot for HDD industry participants.”
HDDs are extensively used – and in vast and increasing quantities – by a substantial number of enterprises of all sizes, and most certainly by every single hyper-scale storage provider in the world today.
With a mature market such as that of the HDD, some changes are to be expected as the industry evolves. As the IDC report states, “Along with the demand for more enterprise HDD storage capacity is a growing need for innovative, new enterprise HDD products capable of economically addressing a variety of data centre storage workloads.”
And there is much innovation in the HDD market as the hard disk drive becomes denser, cheaper, more reliable and more power efficient. And in a marketplace led by price as well as value, the relevance and evolution of spinning disk is as vital as ever.
There is no doubting the many benefits of solid state media. When it comes to the efficiency and speed of its cache memory capabilities, it truly shines. But as with anything that has been hailed as the new “golden child” of an era, when one tries to use it for everything, problems begin to arise.
The trouble is that in this era of data explosion, with ever-higher storage capacities and densities required, once flash drives become the sole primary storage for petabyte-scale workloads, the harsh economic realities of flash versus mechanical media start to appear.
Although data reduction capabilities generally applied to flash media do help chip away at the shortfall in cost per usable unit of capacity, data reduction is neither free nor deterministic. A system that relies heavily on typical data reduction methods to mitigate cost is highly likely to be suffering from performance degradation – especially during peak workloads, when it can least be afforded.
In addition, applications and services are increasingly clever about compressing and optimising the data ‘footprint’ at the point of origin. Trying to compress data that is not very compressible to begin with yields diminishing returns: the data reduction ratios quoted by an all-flash vendor at the time of purchase can deteriorate into considerably less impressive figures as time passes, with effective costs potentially rising to double or even triple what was originally expected.
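The arithmetic behind that deterioration is straightforward. A minimal sketch, using purely hypothetical figures (the raw cost and the reduction ratios below are illustrative assumptions, not numbers from any vendor):

```python
# Hypothetical illustration: the effective cost of flash per *usable*
# terabyte depends heavily on the data-reduction ratio actually achieved.

def effective_cost_per_usable_tb(raw_cost_per_tb: float,
                                 reduction_ratio: float) -> float:
    """Usable capacity = raw capacity x reduction ratio, so the
    effective cost per usable TB is the raw cost divided by the ratio."""
    return raw_cost_per_tb / reduction_ratio

RAW_FLASH_COST = 400.0  # $ per raw TB -- an illustrative assumption

# Ratio quoted at purchase vs. a deteriorated ratio once the workload
# stores data already compressed at the point of origin.
quoted = effective_cost_per_usable_tb(RAW_FLASH_COST, 4.0)   # 4:1 quoted
actual = effective_cost_per_usable_tb(RAW_FLASH_COST, 1.5)   # 1.5:1 achieved

print(f"Quoted: ${quoted:.0f}/usable TB")
print(f"Actual: ${actual:.0f}/usable TB ({actual / quoted:.1f}x higher)")
```

With these assumed figures, a quoted 4:1 ratio that decays to 1.5:1 in practice pushes the effective cost per usable terabyte to well over double the expected figure, which is exactly the kind of gap a TCO-focused buyer needs to model before purchase.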
Enterprise requirements, both now and over the next decade, are consistent and fairly easy to codify: extremely high performance and availability over petabyte-scale data sets at a vastly lower cost per gigabyte. And if the forecasts for the HDD industry are to be accepted, innovation in this market will continue to drive down costs, sustaining the price erosion of storage and making new things possible.
In fact, this is already happening. And that should speak volumes to smart CIOs who keep their eyes firmly fixed on TCO (total cost of ownership) figures, rather than being swayed by the siren song of all-flash vendors lulling their organisations towards a potentially rocky future budget crisis.
The destiny of data storage isn’t necessarily going to be “all-flash”, nor should it be “all” anything. Why should implementation or procurement decisions be based purely on media? Or worse yet, based on a single class of media? Designing a storage system based solely on the media it uses is like constantly retrofitting the same 1968 VW Beetle with each year’s newest, most advanced lightweight Formula 1 racing engine. There will come a point when the overall design – chassis, exterior, frame, curb weight, controls and safety features – has to be re-engineered to optimise overall performance and capability.
For these reasons, an all-flash data centre is not the most sensible or sustainable model for the future of multi-petabyte-scale workloads for all organisations.
Furthermore, the economic disparity between flash media and higher density magnetic media could become a significant barrier to innovation and adoption as the volume of data we globally create, move, store, manage and act upon continues to grow at accelerating rates.
Rather than force-fitting a technology into an environment, developers of a new storage platform need to take a number of key factors into account from the outset, looking beyond the technology itself.
The application of that new technology must be carefully considered: where and how it will be used; what the environment and cost structure will look like; which vertical industries will adopt it; and how it fits the channel ecosystem. Unfortunately, those factors tend to be saved for last, with some vendors developing new whiz-bang solutions that are often an ageing architecture with new foundational layers or media bolted on, without really thinking of the end game.
Successful decision-making on storage procurement should be based on workloads, TCO and the business needs of the present and the future: future-defined, not media-defined.
As compelling as advanced technology such as all-flash may be, it will always work best when applied in the right way, with CIOs keeping a firm eye on the bottom line. It is always wise to be wary of technologists who focus purely on the technology itself, especially where that focus makes the infrastructure more expensive, not less.
Sourced by Randy Arseneau, chief marketing officer, INFINIDAT