Not so many years ago, some analysts were convinced that the cost of storing data would continue to fall so fast that organisations would soon see it as a negligible overhead. But that’s not quite how things turned out.
Even as users have gained access to gigabytes or even terabytes of capacity, the overall cost of providing that capacity has soared.
In fact, market research company Gartner predicts that spending on storage systems will account for half of businesses’ IT budgets over the next few years, with no end in sight to the explosive demand. Indeed, observers such as Joe Tucci, CEO of storage giant EMC, point to demand for storage increasing at 50% a year.
The technologies required to meet that demand for ever-higher capacities seem to be holding up well. The amount of data that disk makers can pack onto a given area of disk (the so-called areal density) has been rising at around 60% a year.
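To put that growth rate in perspective, a quick back-of-the-envelope calculation (a rough illustration only, not a vendor roadmap) shows what a sustained 60% annual gain implies over a decade:

```python
# Rough illustration only: compound a 60% annual areal-density gain over ten years.
annual_growth = 0.60   # ~60% per year, the rate cited above
years = 10

multiple = (1 + annual_growth) ** years
print(f"Sustained for {years} years, that is roughly a {multiple:.0f}-fold increase")
# Prints: Sustained for 10 years, that is roughly a 110-fold increase
```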
Industry insiders, such as Kevin Libert, enterprise systems director for Europe at Dell, expect the rapid fall in raw storage prices and the exponential rise in performance to continue for at least the next 10 years.
As a result, Libert sees no pressing need for a radical shift away from the conventional technology of spinning disks – in favour, say, of flash memory – at least not for core enterprise storage applications.
Limits of growth
Others think the industry may be coming closer to the limitations of disks. “Within the next five years, it is anticipated that magnetic disks may encounter a physical barrier known as the super-paramagnetic limit, where magnetic particles are so close together that they interfere with each other,” says Fred Moore, president of Horison Information Strategies.
Efforts are underway to overcome that ‘barrier’ (just as many others have been cleared before in storage), extending the life of disks into the next decade.
But if disk designers can continue to squeeze ever-greater capacities from their technologies and support the storing of ever-more data, then the industry’s focus will shift to the real pain point: managing that sea of data. Already a major issue at large organisations, this will become an ever-greater challenge. As Moore observes: “The storage management capability is not keeping pace with storage growth.”
Software focus
For the next couple of years, management software will be the storage sector’s main focus as it seeks to provide greater automation of storage administration tasks – the streamlining of retention policies, access management, archiving and other key areas that require direct intervention today.
There is also likely to be renewed interest in storage services, in which third parties take on the complexity of holding and managing data stores. After a few lean years, Gartner predicts that demand for such services will reach $30 billion by 2007. Indeed, even systems and software vendors such as EMC say they expect to derive half their revenues from storage services in the near future.
Fuelling the economics of such services will be increased take-up of storage area networking (SAN) technology. Prices of SANs are already falling: Dell, for example, expects to market a SAN for under $10,000 later in 2004, and notes that early adopters of SANs have benefited substantially by pooling their storage and making it available to multiple servers.
One of the major changes spurring that take-up in coming years will be the ability to mix and match different vendors’ products in the same network.
Although smaller businesses might be content to source their SAN components from a single vendor, larger companies, faced with pressure to consolidate storage and the frequent requirement to accommodate systems coming into the business as a result of mergers and acquisitions, are demanding interoperability.
Standards are reaching a viable level and products that support true SAN interoperability should start emerging from vendors over 2004 and 2005.
SANs will gain even further ground because they make it easier for companies to manage their business continuity, regulatory data access and archiving needs. Increasingly, businesses of almost any significant size require some form of near-line, off-site backup. The perception is that locally stored tapes, even in a fire vault, are simply too vulnerable, and as data volumes continue to grow, recovery from tape becomes slower.
SANs should also help companies implement information lifecycle management policies, under which data is moved from production storage to cheaper disk units and eventually to tape for long-term archiving, based on access priorities.
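As a rough sketch of how such a policy might be expressed (the tier names and age thresholds below are hypothetical, not drawn from any particular vendor’s product), an information lifecycle rule simply maps the time since data was last accessed to a storage tier:

```python
from datetime import datetime, timedelta

# Hypothetical tiering rule for an information lifecycle management policy:
# the longer a data set goes unaccessed, the cheaper the tier it moves to.
def pick_tier(last_accessed: datetime, now: datetime) -> str:
    """Return the storage tier data should live on, based on how long ago it was accessed."""
    age = now - last_accessed
    if age <= timedelta(days=30):
        return "production disk"   # hot data stays on fast, expensive storage
    if age <= timedelta(days=180):
        return "low-cost disk"     # cooling data moves to cheaper disk units
    return "tape archive"          # cold data goes to tape for long-term retention

# Example: a file untouched for eight months ends up on tape.
print(pick_tier(datetime(2004, 1, 1), datetime(2004, 9, 1)))  # -> tape archive
```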