The unprecedented rise of data has become a primary challenge that IT storage industry specialists need to take into consideration when preparing a data storage strategy for 2017.
Even with the rapid pace of technology advancement, the average end user is still struggling to manage legacy systems while attempting to navigate the benefits and capabilities of a range of vendors’ storage solutions.
Certainly, there needs to be a ‘demystification’ of storage so that organisations gain the clarity needed to make informed strategic decisions that suit their particular industry challenges, legislation and business workloads. An understanding of the current forces at play and where the market is headed will significantly aid in the creation of a smart storage strategy.
Firstly, budgets are not keeping up with storage needs: the former remain flat or even decrease, while the latter continue to grow exponentially. A recent IDC report illustrates how the external storage market in Europe is shrinking. And yet we need as much storage as we produce data, and we produce a great deal of data.
>See also: Cloud storage is the new battleground in the cloud price war
Therefore, organisations need to come up with smarter storage strategies to deal with their exploding data, as opposed to continuing with the traditional behaviour of simply adding more capacity, which is not a viable long-term solution. Moreover, budget constraints motivate organisations to extract as much value as possible from their existing investments, even as higher performance is constantly demanded.
The driving force behind the need for smarter storage strategies comes from the value data has gained, and continues to gain, as more and more of it is generated. As data sets continue to grow, organisations need to find ways to store, analyse and re-monetise that data over time. Keeping this data forever allows the company to use it forever. This speaks very loudly to boardrooms.
Cloud popularity will not decrease, but the excitement around cloud storage – public, private and hybrid, which is now the most widely used – is likely to abate. This is because of the continued budget constraints discussed above, as IT decision makers discover that the cloud is not necessarily the most affordable data storage option.
>See also: Enterprise cloud storage: usage and trends
For those organisations generating significant volumes of data with high workloads, more cloud is often an expensive option. In 2017, we will see a lot of mixing and matching of hybrid clouds with other storage platforms.
CIOs and infrastructure managers will need to make hard choices, caught between accommodating their storage needs and the demands of the boardroom, where budgets are decided. Organisations will need cost-effective solutions that allow them to continue to offer their users high performance.
One benefit IT leaders can gain from this cloud world is that object stores are moving into the mainstream. Because they use the same RESTful protocols, most private clouds are really just object storage. As a result, many enterprise accounts are looking to object storage to ease the data management burden.
The issue with object storage is that its protocols are not yet embedded into operating systems, but more and more applications can talk directly to object storage. Organisations are looking to reduce complexity, middleware maintenance, upgrades and costs in order to simplify infrastructure and allow more integration between different tiers, and object storage provides a vehicle to this end. Along with the growing adoption of object storage, hardware producers are working harder to put forward joint solutions, rather than leaving it to customers to put them together.
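To make “talking directly to object storage” concrete, here is a minimal sketch of an application reading and writing objects over an S3-compatible RESTful API using Python’s boto3 library; the endpoint, credentials and bucket names are illustrative placeholders rather than details from any particular vendor.

```python
# Minimal sketch: an application using an S3-compatible object store
# directly over its RESTful API, with no file-system middleware in
# between. Endpoint, credentials and bucket/key names are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objects.example.internal",  # hypothetical private cloud endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Write an object straight from the application.
s3.put_object(Bucket="archive", Key="reports/2017/q1.csv", Body=b"region,revenue\nemea,100\n")

# Read it back; data is addressed by bucket and key, not by a mount
# point or file path exposed by the operating system.
obj = s3.get_object(Bucket="archive", Key="reports/2017/q1.csv")
print(obj["Body"].read().decode())
```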
>See also: An insight into the cloud storage industry
End users will continue to want the most transparent ease of use, while IT managers will seek out better management capabilities, improved integration, increased simplicity with greater choice and flexibility, and the ability to respond to their users’ needs. Hybrid environments will be a higher priority as organisations continue to demand methodologies that enable them to move data from one format to another, quickly and efficiently.
There is a need for fast file transfers between primary storage, local tape and disk-based private clouds, or to any public cloud. This integration gives organisations the power to drag and drop data directly to the storage destination of their choice. Accordingly, we will see higher demand for storage tiering automation as companies realise the importance of automation and hyperconvergence.
For the foreseeable future, file systems will remain the primary storage tier, but there will continue to be a real need to identify and store cold or secondary data in a way that lessens the demand on primary storage arrays. As such, planning for archive storage will be a priority in 2017, because cold data that has to be retained long term needs an economical means of safekeeping aligned with cost requirements.
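As an illustration of the kind of tiering automation described above, the sketch below demotes files that have not been accessed for a set period from a primary file system to an archive tier; the paths and the 90-day threshold are hypothetical placeholders, not a description of any specific product.

```python
# Illustrative cold-data tiering policy: files on the primary tier that
# have not been accessed for COLD_AFTER_DAYS are moved to an archive
# tier (for example, a path backed by tape or an object store gateway).
# All paths and thresholds are placeholders.
import os
import shutil
import time

PRIMARY_TIER = "/mnt/primary"   # hypothetical primary file system
ARCHIVE_TIER = "/mnt/archive"   # hypothetical archive mount
COLD_AFTER_DAYS = 90

cutoff = time.time() - COLD_AFTER_DAYS * 86400

for root, _dirs, files in os.walk(PRIMARY_TIER):
    for name in files:
        src = os.path.join(root, name)
        if os.stat(src).st_atime < cutoff:           # last access older than cutoff
            rel = os.path.relpath(src, PRIMARY_TIER)
            dst = os.path.join(ARCHIVE_TIER, rel)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.move(src, dst)                    # demote the cold file
            print(f"archived {rel}")
```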
>See also: The year of the cloud: flexible, agile and scalable
As such, enterprise tape combined with object storage can take a leadership position, specifically for companies looking to keep their data long term. For example, there is more and more video surveillance data around, which translates into a need for tape because it is low cost and effective for long-term retention.
All-flash arrays (AFAs) will continue to do very well. Flash will make capacity gains as it becomes a mainstream means of putting storage close to the processor and as prices continue to fall. The consolidation use case represents the future of AFAs and will grow rapidly in the near future, since budget-constrained organisations will want to extract as much value as possible out of their costly AFA investments.
With the emergence of new technologies, the expected end user benefits usually centre on capacity, performance or both. However, this does not necessarily mean the user is getting the most out of the new technology, or in fact driving revenue to the bottom line.
>See also: Is Hadoop’s position as the king of big data storage under threat?
One consideration that is often overlooked when new technology is purchased is that people also need to be trained. You can’t introduce new methods or new technologies and expect your teams to immediately know the quickest and most efficient way to use them. Therefore, greater importance needs to be placed on training end users in how best to exploit new technologies and approaches (e.g. DevOps).
Because the value of data directly impacts a company’s bottom line, establishing how to implement a data storage strategy is pivotal to overall success. This requires research, careful planning, an understanding of the process and how to combine the best technologies to create a tailored, smart storage strategy.
Sourced by Matt Starr, CTO, Spectra Logic