Utility computing aims to solve many of the data centre manager's biggest headaches, such as low utilisation and lack of scalability. But there is one major problem impeding adoption. Neither vendors nor internal IT departments supplying their own business users have been able to determine a charging model for utility computing. "There is no single unit of measurement available," says Una Du Noyer, head of infrastructure and security at IT services giant Capgemini.
That lack of obvious metrics stems from the nature of utility computing. The architecture provides a virtual pool of processing power that can be 'provisioned' to different applications at different times – and critically, only in the quantity required. With applications not 'hosted' on any specific server and often using the processing for a limited time, the standard units of measurement are harder to determine.
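To make the metering problem concrete, one obvious candidate unit is processor-hours actually provisioned, keyed to the application rather than to any server. The following minimal sketch (in Python, with invented application names and figures, not any vendor's actual billing model) shows how such a meter might aggregate usage:

from dataclasses import dataclass
from collections import defaultdict

@dataclass
class ProvisioningEvent:
    app: str       # application that drew on the shared pool
    cpus: int      # processors allocated for the interval
    hours: float   # how long the allocation was held

def cpu_hours_by_app(events):
    """Aggregate metered usage into a per-application billing basis."""
    usage = defaultdict(float)
    for event in events:
        usage[event.app] += event.cpus * event.hours
    return dict(usage)

# Invented example: a month-end payroll burst alongside a steady web workload.
events = [
    ProvisioningEvent("payroll", cpus=8, hours=2.0),
    ProvisioningEvent("payroll", cpus=1, hours=24.0),
    ProvisioningEvent("web", cpus=2, hours=24.0),
]
print(cpu_hours_by_app(events))  # {'payroll': 40.0, 'web': 48.0}

Because usage is keyed to the application and the interval rather than to a host, the same meter works whichever physical servers happen to supply the cycles.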
"There are a number of emerging models but no one is dominant at the moment," says Du Noyer (see box).
For IT directors applying utility computing within their own organisation, groundwork is required in policy as well as technology, she counsels. Services should be categorised so that the system is largely self-provisioning. But if usage exceeds certain parameters, there must be measures in place to capture that information in order to bill for it.
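As an illustration of that policy groundwork, the sketch below assumes two invented service categories with processor ceilings: requests within the ceiling self-provision silently, while anything above it is still granted but logged so the excess can be charged back. The categories, names and numbers are hypothetical:

CATEGORIES = {"standard": 4, "batch": 16}  # assumed CPU ceilings per category
overage_log = []  # captured so above-limit usage can be billed later

def provision(app: str, category: str, cpus: int) -> int:
    """Self-provision within the category ceiling; record any excess."""
    ceiling = CATEGORIES[category]
    if cpus > ceiling:
        # The request is not refused; the excess is metered for chargeback.
        overage_log.append({"app": app, "excess_cpus": cpus - ceiling})
    return cpus

provision("analytics", "standard", 6)
print(overage_log)  # [{'app': 'analytics', 'excess_cpus': 2}]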
Not all the solutions are yet available. Du Noyer warns that some applications cannot work in a partitioned environment and few software companies have reworked their licensing models to adapt to the new virtualised world.
If companies with few compute-intensive processes use this model to rent out their spare IT resources, a new type of IT services supplier may be required: a capacity broker, to orchestrate billing and scheduling dynamically.
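What such a broker might do can be sketched in a few lines: take offers of spare capacity from several firms, fill each demand from the cheapest offer first, and emit a billing line per match. This is purely speculative; the suppliers, rates and greedy matching rule are all invented for illustration:

offers = [  # (supplier, cpu-hours available, price per cpu-hour)
    ("firm_a", 100, 0.05),
    ("firm_b", 50, 0.04),
]

def broker(demand_cpu_hours):
    """Fill demand from the cheapest offers first; return billing lines."""
    bill = []
    for supplier, capacity, rate in sorted(offers, key=lambda o: o[2]):
        take = min(demand_cpu_hours, capacity)
        if take > 0:
            bill.append((supplier, take, round(take * rate, 2)))
            demand_cpu_hours -= take
    return bill

print(broker(120))  # [('firm_b', 50, 2.0), ('firm_a', 70, 3.5)]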
While uncertainty reigns today, Du Noyer is confident that the billing options will become clear: "We are now where the telcos were when they were moving from fixed line [to fixed-mobile coexistence] – and they cracked it."