With its debut five years ago, thin provisioning technology promised to take the sting out of one of the perennial problems of enterprise storage.
In most organisations, even today, storage capacity is heavily over-allocated to applications so they can meet volatile growth expectations, assure service levels and avoid future upgrade complexity. But the downside of that worst-case provisioning is that only a fraction of the allocated capacity is ever pressed into service.
That “monumental waste of capital and related operating expense” (as thin provisioning pioneer 3Par has called it) may have been bearable in previous times. But as data centre space constraints and energy costs have grown painfully acute, over-provisioning has increasingly been a luxury – and even a badge of eco-irresponsibility – that many would rather not have.
The different implementations of thin provisioning – from the market makers such as 3Par, Compellent, NetApp and Pillar Data Systems, and more recently from the enterprise storage latecomers HDS, HP, Dell and EMC – try to get round that ‘economically idiotic’ situation, as analyst Steve Duplessie, founder of industry adviser Enterprise Storage Group, likes to describe it.
But even after half a decade of intense marketing, demand for the technology is far from universal – and for some pretty good reasons.
Some storage managers – whose jobs rest, in many cases, on ensuring their organisations never run out of storage capacity and that performance is never hurt by a shortage of storage space – are uncomfortable with the risks associated with thin provisioning. They have discovered – often the hard way – that thin provisioning is not suitable for all applications and file types. They also recognise that some vendors have a mature and highly effective implementation of thin provisioning, while others are still dabbling with the technology or find its functionality conflicts with many of their other storage system priorities.
Weight watchers
To listen to its evangelists, though, thin provisioning is a panacea for many storage headaches. “Thin provisioning drives utilisation efficiency of disk systems. Organisations that are running at 10% to 15% with their direct attached storage and maybe 40% to 50% on their storage area networks have the promise of achieving 70% to 80% or higher,” says Craig Nunes, VP of marketing at 3Par, the company credited with launching the sector.
He cites the benefits: “Less capacity, cheaper capacity, improved sustainability with the same results [as traditional ‘fat-provisioning’].” And there’s more: lower total cost of ownership through the purchase of fewer disk systems, a lower administration overhead, lower energy bills through power and cooling savings, and a smaller footprint for storage systems in the data centre.
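A rough, hypothetical calculation illustrates the arithmetic behind those utilisation claims (the figures below are assumed for illustration, not drawn from any vendor benchmark): the lower the average utilisation, the more raw capacity has to be purchased to hold the same amount of data.

```python
# Hypothetical illustration of the utilisation arithmetic: how much raw
# capacity must be purchased to hold the same data at different average
# utilisation rates. All figures are assumed, not vendor measurements.

data_in_use_tb = 30  # actual data written by applications (assumed)

for label, utilisation in [("DAS (~15% utilised)", 0.15),
                           ("SAN (~45% utilised)", 0.45),
                           ("Thin provisioned (~75% utilised)", 0.75)]:
    raw_needed_tb = data_in_use_tb / utilisation
    print(f"{label:32s} raw capacity required: {raw_needed_tb:6.1f} TB")

# The gap between those figures is the over-provisioned, idle capacity
# that thin provisioning aims to eliminate.
```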
The secret lies in tricking applications into thinking they have more storage space than is physically available in the system. “The volume claims [to the applications] to be 4 terabytes but it is actually only backed in storage with 600 gigabytes, for example,” says Eric Schott, director of product management at Dell’s recently acquired Equallogic business. While that might be the equivalent of an airline overselling seats, the critical aspect is that applications only consume physical disk capacity when data is written to the disk – bit by bit.
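A minimal sketch of that idea, purely for illustration and not modelled on any vendor’s implementation: the volume reports a large logical size to the application, but physical blocks are only allocated the first time they are written.

```python
# Minimal, illustrative model of a thin-provisioned volume: the logical
# size reported to the application is fixed, but physical blocks are only
# allocated the first time each block is written.

class ThinVolume:
    def __init__(self, logical_size_gb, block_size_mb=1):
        self.logical_size_gb = logical_size_gb  # size the application sees
        self.block_size_mb = block_size_mb      # allocation granularity
        self.allocated_blocks = set()           # physically backed blocks

    def reported_size_gb(self):
        return self.logical_size_gb             # the volume's "claim"

    def physical_used_gb(self):
        return len(self.allocated_blocks) * self.block_size_mb / 1024

    def write(self, offset_mb, length_mb):
        # Back only the blocks this write actually touches.
        first = offset_mb // self.block_size_mb
        last = (offset_mb + length_mb - 1) // self.block_size_mb
        for block in range(first, last + 1):
            self.allocated_blocks.add(block)

vol = ThinVolume(logical_size_gb=4096)           # claims to be 4TB
vol.write(offset_mb=0, length_mb=600 * 1024)     # only 600GB ever written
print(f"{vol.reported_size_gb()} GB reported, "
      f"{vol.physical_used_gb():.0f} GB physically backed")
```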
Demand for thin provisioning, however, has not been as great as some analysts predicted. “The take-up of the last couple of years has been pretty slow actually, and fairly niche,” argues Tony Reid of Hitachi Data Systems (HDS).
One reason, he says, is that until recently it did not have the enthusiastic support of the larger storage vendors [although NetApp might be seen as an exception]. “We see it as just one option within the storage tiers that customers want to fit into their services catalogue rather than a storage entity in its own right,” says Reid. HDS has been offering thin provisioning since the launch of its USP V line in mid 2007.
At IT services company Morse, storage consultant Chris Reid senses enthusiasm building elsewhere. “This is a technology that has seen take-up from the grass roots; it has had success among small and mid-sized businesses and has really come from some of the smaller storage vendors. It has grown with such momentum that the larger vendors have had to take it on,” he says. Over the past 12 months that list has included HP, EMC, Dell and HDS, with IBM talking about a thin provisioning product but yet to announce any firm launch dates.
A pioneer like 3Par sees things differently. “We’ve had the technology out for some time [five years], so certainly our customers – right up to some of the largest banks in the world – are sensitised to the potential,” says Craig Nunes. “From our perspective, it is a ‘must-have’ in data centres, given the amount of over-provisioned and unused disk capacity there is. Getting the job done with maybe two-thirds less disk capacity is something that people need to look into if they are not doing so already.”
Aside from more efficient and greener data centres, a further factor driving thin provisioning interest is the ongoing cost of managing the storage environment. “A large part of those storage management costs is the effort involved in continually allocating more storage to servers. So if you can just do that in one hit, that dramatically reduces the amount of storage administration required,” says HDS’s Reid.
Cautious embrace
But storage buyers – a conservative bunch on the whole – are not readily convinced. A recent survey of 249 storage professionals by storage capacity management software company MonoSphere revealed that 77% viewed the increased risk of running out of storage alongside added management complexity as major obstacles hampering the implementation or expanded deployment of thin provisioning.
Those risk concerns raise the question: where is thin provisioning working well for organisations, and where has it been far from optimal?
“Thin provisioning is not for every application,” says Eric Schott of Dell Equallogic. “However, every administrator probably has some use for it. So their challenge is how it can become a ubiquitous utility, but also how it can be turned off where it is not working. A lot of vendors have made thin provisioning in their products a bit of a one-way street. The back-out can be very painful.” Equallogic has architected its product to make that exit possible.
That is just one consideration that those considering thin provisioning should keep in mind. “There are multiple nuances about the technology that are important to understand,” says 3Par’s Nunes.
When buying thin provisioning for the first time, people should understand two critical aspects, he argues. “Thin provisioning effectively works to provide disk capacity as an application writes to the disk. That on-demand, when-you-need-it model is the source of the utilisation benefits,” he says.
When an application writes a small amount of data (8K would be a typical database write), an efficient thin provisioning implementation will bring in only a small amount of capacity (known as the allocation unit) to that thin-provisioned volume.
Incredulous, Nunes highlights how some products in the market today stray well away from that principle. “Believe it or not, there are implementations today that, when you write 8K of data, draw in 5,000 times more than that,” he says.
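To make the allocation-unit point concrete, here is a short worked calculation; the unit sizes are hypothetical examples rather than measurements of any particular product.

```python
# Hypothetical illustration of how the allocation unit determines the
# physical capacity a single small write pulls into a thin volume.

write_size_kb = 8  # a typical database write, per the article

for allocation_unit_kb in [16, 256, 16 * 1024, 40 * 1024]:
    amplification = allocation_unit_kb / write_size_kb
    print(f"allocation unit {allocation_unit_kb:>6} KB: an 8K write "
          f"consumes {allocation_unit_kb} KB ({amplification:,.0f}x the data)")

# With a coarse ~40MB allocation unit, an 8K write draws in roughly
# 5,000 times the data actually written - the behaviour Nunes describes.
```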
It appears that some file systems simply do not sit well with thin provisioning. Some, in order to optimise their performance, spread the metadata they use to manage data across the whole volume, consuming thin-provisioned capacity much faster than anticipated.
Sun Solaris is mentioned by some observers. Other file systems – Microsoft NTFS is an example several others cite – are also highly inefficient in a thin provisioning environment.
Indeed, anecdotal evidence suggests that some thin provisioning implementations handle allocation differently depending on the data source and file system feeding them.
What that all comes down to is a requirement to ensure a much higher level of application awareness, says Chris Reid at Morse. “You need to understand what is happening before it becomes an issue, be proactive rather than reactive. If all the applications need all the data resources at once, you will have a problem. It is monitoring and understanding that before it becomes an issue that is key.”
There must also be some well-understood agreement between the business user and the storage manager over the likely growth peaks and troughs of the application. “If you can’t make the agreement on the growth rates over time, then thin provisioning might be a bad idea,” says Schott, “because if there is a ‘run on the bank’ [i.e. the application suddenly starts needing a lot more storage] then you are likely to run into credit problems.”
The primary exposure, in the opinion of Morse’s Reid, is performance. “By running the devices at a higher utilisation rate, performance may suffer. If you have an application that requires highly performant storage, say in financial services where the transaction processing time is absolutely critical, then thin provisioning is probably not the best place to store that data.
“On the other hand, if it is an email system or some kind of web-based archive, then thin provisioning will provide decent benefits because those sort of systems tend to grow relatively quickly,” he adds.
The key is awareness – at the application and device level. Continual automated monitoring and alerts of allocated, used and physical storage enable administrators to ensure that the appropriate physical storage space is available when it is actually needed, says Compellent.
Those growth monitors have to be sophisticated, as any ‘missed payment’ will likely result in an application outage when it has nowhere left to write.
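A minimal sketch of the kind of allocated/used/physical monitoring check described above; the thresholds, pool sizes and growth figures are assumptions for illustration, not any product’s defaults.

```python
# Illustrative thin-pool capacity check: compare physical, used and
# allocated capacity and raise an alert well before the pool runs dry.
# Thresholds and figures are assumptions, not any product's defaults.

def check_thin_pool(physical_tb, used_tb, allocated_tb,
                    growth_tb_per_week, warn_at=0.80, crit_at=0.90):
    usage_ratio = used_tb / physical_tb
    oversubscription = allocated_tb / physical_tb
    weeks_to_full = ((physical_tb - used_tb) / growth_tb_per_week
                     if growth_tb_per_week > 0 else float("inf"))

    if usage_ratio >= crit_at:
        level = "CRITICAL"
    elif usage_ratio >= warn_at or weeks_to_full < 4:
        level = "WARNING"
    else:
        level = "OK"

    print(f"{level}: pool {usage_ratio:.0%} full, "
          f"{oversubscription:.1f}x oversubscribed, "
          f"~{weeks_to_full:.1f} weeks until full at current growth")

# Example: 100TB of physical disk backing 250TB of allocated thin volumes.
check_thin_pool(physical_tb=100, used_tb=82, allocated_tb=250,
                growth_tb_per_week=3)
```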
While the technology is still maturing, it looks set to become much more widely installed over the next couple of years. Says Morse’s Reid, “The best practice around the technology is increasing all the time, and it is becoming a credible alternative. The scepticism around the hype is starting to go away. And we have some key business drivers that are forcing large enterprises to pay more attention to it.”
“The bigger vendors have taken notice and that is getting the attention of more enterprise-level customers. That means that thin provisioning will become a standard option on array offerings going forward,” he adds.