Last month, 7,000 excited delegates at VMware’s VMworld 2006 user conference crowded into the main auditorium of the Los Angeles Convention Center to hear the company’s co-founder and president deliver her keynote speech. They were not disappointed.
“For years,” said Diane Greene, “our entire industry has been marching to the cadence of the operating system.” By doing so, she said, both hardware and application software innovation have been stifled by the need to stay in step with an increasingly complex and cumbersome set of legacy services, which are often redundant and inefficient. But, she promised, that is about to change.
Thanks to a “little, thin layer of software”, said Greene, “there is now a phenomenal opportunity for our industry to change the status quo. There will be no more arbitrary reasons for purchasing software. Virtualisation will let our customers choose software based on functionality, reliability, performance and price.”
Restoring order
The “little, thin layer of software” that Greene was referring to is the bare-metal ‘hypervisor’ software that VMware introduced to the market as part of its ESX Server product, launched in 2001. At the time, ESX and its simpler, free-to-download companion product, GSX Server, represented little more than a speculative attempt to see what demand there might be for a virtual machine server capable of running on commodity Intel x86-based processors. By the standards of most toe-in-the-water marketing exercises, it has been phenomenally successful.
It is still an exaggeration to say that virtualisation has changed the status quo of the computer industry. However, there is plenty of evidence that, for early adopters, ESX and the proliferating third-party products that have grown up around it are already delivering in terms of “functionality, reliability, performance and price”.
Indeed, since ESX’s launch five years ago, server virtualisation has been transformed from a niche activity practised by the high-end customers of a few proprietary mainframe and RISC-based server vendors into an IT strategy that is rapidly acquiring mainstream status. Last year, according to IDC, sales of virtual machine software achieved near-70% growth for the second consecutive year, creating a $580 million market that the research company confidently predicts will more than triple in value – to $1.8 billion – by 2010.
However, whether they are big or small, organisations acquiring virtualisation technology are primarily motivated by the prospect of realising better value from their hardware resources – often, but by no means exclusively, by reducing the number of physical servers needed to run their business.
Certainly, according to analysts such as John Humphreys, IDC’s programme director for enterprise platform research, server consolidation has been the major driver for virtualisation so far – and that is hardly surprising.
In the immediate aftermath of the dot-com era, when products such as ESX first began to appear in the market, many companies found themselves with huge numbers of application-specific ‘commodity’ servers. Widely dispersed and typically utilised to 15% or less of their potential capacity, these sprawling server estates represent a huge waste of resources and an often crippling management burden.
Value for money
Many organisations plagued by such server proliferation are now ready to testify that virtualisation can be a powerful remedy, among them Aspen Pharmacare, South Africa’s largest pharmaceuticals manufacturer. Like many other organisations that bear the heavy burden of both complex scientific and administrative systems, Aspen was keen to reduce its costs by centralising and rationalising an applications estate that had spread across dozens of servers in numerous locations, in South Africa and overseas.
“The idea was to try and get more computing power for our rand,” said Pat Williams, Aspen’s IT director, “and to reduce our management costs without also having to reduce demand for applications. We thought we could do this by moving multiple applications onto a single server.”
Despite having little familiarity with virtualisation technology before installing VMware’s ESX Server, Williams found that Aspen was quickly able to exceed its expectations.
“We didn’t do a lot of piloting,” he said. Aspen simply started by mounting two virtualised application stacks on one server, increased this to three, then four, and in one instance ran eight applications on a single server. Ultimately, “when we did the cost analysis, we found that you only need an average of two and a half applications to realise a return on investment.”
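That break-even figure is easy to sanity-check with a back-of-the-envelope model. The Python sketch below uses purely hypothetical costs (the article does not disclose Aspen’s actual figures): consolidation pays for itself at the point where one virtualised host, carrying the extra cost of a virtualisation licence, costs no more per application than a dedicated box.

    # A back-of-the-envelope sketch of the consolidation break-even point.
    # All cost figures are hypothetical illustrations, not Aspen's numbers.

    def breakeven_apps(box_cost: float, mgmt_cost: float, virt_cost: float) -> float:
        """Applications per host at which consolidation matches dedicated hosting.

        Dedicated hosting costs (box_cost + mgmt_cost) per application. A
        consolidated host adds a virtualisation licence (virt_cost) but spreads
        the whole bill across n applications; break-even is where the two
        per-application costs are equal.
        """
        return (box_cost + mgmt_cost + virt_cost) / (box_cost + mgmt_cost)

    # Illustrative inputs, chosen only to show the shape of the calculation:
    print(breakeven_apps(box_cost=12_000, mgmt_cost=4_000, virt_cost=24_000))  # 2.5

On those invented inputs the answer lands at exactly two and a half applications per host; cheaper virtualisation licences, or more expensive boxes, would push the break-even point lower still.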
In Aspen’s case, these savings stemmed from the better utilisation of resources it achieved, and from the reduced management effort required to run fewer physical servers. And these are not the only benefits of a virtualisation-driven consolidation exercise.
During confidential discussions of virtualisation at Information Age reader debates, data centre managers have cited examples of virtualisation being used to increase the application-to-server ratio to as much as 22:1. In today’s crowded and power-hungry data centres, such dramatic consolidation can both free up valuable server space and reduce the power consumed per application.
Indeed, the potential power savings to be had from virtualising even a single application are such that Pacific Gas and Electric, the Californian power utility, is currently offering to pay data centre operators $300 – up to a maximum of $4 million per company – for each physical server that is replaced by a virtual alternative.
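The arithmetic of the scheme, as described, fits in a couple of lines of Python – the $300 and $4 million figures come straight from the terms above, while the function itself is merely illustrative:

    # PG&E-style rebate as described above: $300 per decommissioned
    # physical server, capped at $4 million per company.

    def rebate(servers_removed: int) -> int:
        return min(300 * servers_removed, 4_000_000)

    print(rebate(500))     # 150000
    print(rebate(20_000))  # 4000000

In other words, only the very largest estates – upwards of 13,334 decommissioned machines – would hit the ceiling.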
While server consolidation may currently be the commonest reason for deploying virtualisation software, it is by no means the only one. As Rob Hailstone, director of software platform research at Butler Group, points out: “When you look at what virtualisation sets out to do, it’s really a generic approach to increasing the utilisation of the resources that you have. It does this by abstracting away the properties that make that difficult, leaving you with a pool of resources that you can reuse in any number of different and potentially more meaningful ways. Virtualisation has implications across the whole breadth of IT,” he says.
Finding examples of uses for virtualisation other than server consolidation is not difficult, particularly as, having cut their teeth on an initial server rationalisation project, many users quickly discover new uses for their latest IT “miracle cure”.
At Aspen Pharmacare, for instance, now that the server estate has been rationalised to a much more manageable size, Williams has turned his attention to using virtualisation techniques to improve the management of the servers that remain. So far, this has included the adoption of a recent addition to VMware’s product portfolio: VMotion, its live migration toolset.
VMotion exploits the fact that a virtualised application stack (including its operating system) is ‘packaged’ as a self-contained entity, which can then be passed from physical server to physical server with little or no disruption. At Aspen, Williams has used it to migrate live virtualised servers to alternative physical machines, so that the original hardware can be taken down for maintenance without interrupting production.
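Live migration tools of this kind are generally understood to rely on an iterative ‘pre-copy’ technique: the machine’s memory is copied to the new host while the virtual machine keeps running, pages that are modified during the copy are sent again, and only a final, small set of dirty pages is transferred during a brief pause. The self-contained Python simulation below illustrates that general approach; it is a sketch, not VMware’s actual implementation, and every name in it is a hypothetical stand-in.

    import random

    def live_migrate(memory, workload, max_rounds=10, threshold=4):
        """Pre-copy a running VM's memory to a target host, pausing only briefly.

        memory   -- page_id -> contents on the source host
        workload -- callable returning the set of pages dirtied during a copy round
        """
        target = {}
        dirty = set(memory)                    # round 0: every page needs copying
        for _ in range(max_rounds):
            for page in dirty:
                target[page] = memory[page]    # copy while the VM keeps running
            dirty = workload()                 # pages written to during that round
            if len(dirty) <= threshold:
                break                          # small enough to send during the pause
        # Stop-and-copy: pause the VM, send the last dirty pages, and resume on
        # the target. Only this final step interrupts service.
        for page in dirty:
            target[page] = memory[page]
        return target

    # Toy run: 100 pages, with the guest dirtying a shrinking set each round.
    pages = {i: f"page-{i}" for i in range(100)}
    rounds = iter([set(random.sample(range(100), k)) for k in (30, 12, 3)])
    assert live_migrate(pages, workload=lambda: next(rounds)) == pages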
However, the same product, and others like it from companies such as PlateSpin and Opsware, could equally be used to shift applications to bigger or smaller servers according to fluctuating capacity demands. Alternatively, some suppliers are now also using similar techniques to provide improved disaster recovery safeguards, without the need to keep wastefully redundant resources in reserve.
Once users or service providers have grasped the full potential of making applications, and the operating system services that support them, entirely independent of any specific underlying physical platform, there are almost endless opportunities for improving on conventional IT delivery systems, including the end-user client.
Hidden costs
With so many impressive examples of virtualisation in action to dazzle them, it would not be surprising if some organisations fell into the trap of taking the claims of its proponents at face value and rushed into virtualisation without fully researching the implications. That, however, would be a risky course of action.
Although it is probably true that, as independent voices like Hailstone and Humphreys argue, it is virtualisation, and not grandiose proprietary “autonomic” or “adaptive” systems architectures, that holds the key to the realisation of true utility services, there is still much work to be done.
For instance, whilst a healthy technology ecosystem is already springing up around pioneers like VMware, there are still yawning holes in the industry’s ability to offer comprehensive virtual systems management services.
Furthermore, the prospect of an ill-managed proliferation of virtual servers should be no more attractive to an IT director than a collection of poorly managed boxes – indeed, the potential for virtual chaos may yet prove to exceed anything the physical world can offer.
And as one IT consultant, who preferred not to be identified, disclosed, not all hardware manufacturers welcome the prospect of higher utilisation rates on their machines.
In some cases, his clients discovered that the warranties on their equipment would be invalidated if the hardware was used in conjunction with virtualisation technology.
“The manufacturer knew that the boxes were not designed to run above 30% utilisation – any higher and they’d ‘melt’,” said the consultant.
However, potentially the biggest obstacle to the full realisation of a virtual world lies in the realm of licensing. As some virtualisation enthusiasts have already discovered, although a virtual server may be divorced from physical reality, the same is not always true of operating system or application licences.
When Greene referred to the stifling effect of operating systems on innovation, it was no secret that she included in this the unwillingness of Microsoft to give customers carte blanche in their creation of virtual Windows servers.
For many, each virtual instance of Windows will need a licence, although for those companies prepared to pay the cost of Windows Server Enterprise edition, Microsoft has relaxed its restrictions on the creation of Windows instances.
Eventually, Microsoft may offer similar terms to its more mainstream Windows Server users, but that is not something to count on. As server virtualisation spreads throughout the global corporate IT infrastructure, there stand to be many winners – but one of the potentially serious losers is a company with a reputation as a notoriously tough competitor.
Virtual desktop
It is easy to assume that virtualisation offers its greatest benefits to data centre managers, and so to overlook the game-changing impact it can have on managing the PC estate.
At Voca, the UK payments agency (formerly known as BACS), virtualisation software from Softricity – since acquired by Microsoft – has been used to significantly reduce the effort and complexity involved in supporting 700 highly individual PC users.
Softricity’s SoftGrid software allows Voca to create individual PC software images centrally and deliver them, on demand, to end users, regardless of which physical terminal they are using. This approach has given users greater freedom and more reliable access to their own data and the applications they need, says Irene Blaston, Voca’s head of desktop and web infrastructure. At the same time, it improves management control, allowing her to dictate the allocation of desktop resources, including expensive – and now more easily metered – end-user licences.
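In outline, the architecture is a central catalogue of per-user application images, streamed on request to whatever terminal the user logs in from, with licence use metered at the point of delivery. The short Python sketch below illustrates that shape; the class and its methods are hypothetical stand-ins, not Softricity’s actual interfaces.

    from dataclasses import dataclass, field

    @dataclass
    class DesktopCatalogue:
        """Central store mapping each user to a personal application image."""
        images: dict = field(default_factory=dict)
        licence_meter: dict = field(default_factory=dict)

        def assign(self, user, apps):
            # Each image is defined once, centrally, by the administrator.
            self.images[user] = list(apps)

        def deliver(self, user, terminal):
            # Stream the user's image to whichever terminal they log in from,
            # counting each delivery of a licensed application as it goes out.
            apps = self.images[user]
            for app in apps:
                self.licence_meter[app] = self.licence_meter.get(app, 0) + 1
            print(f"delivering {apps} to {user}@{terminal}")
            return apps

    catalogue = DesktopCatalogue()
    catalogue.assign("user01", ["office-suite", "payments-client"])
    catalogue.deliver("user01", terminal="desk-17")
    catalogue.deliver("user01", terminal="home-laptop")  # same image, any terminal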
Blaston speaks of forthcoming changes at Voca: “Next year we are opening a new European office,” she says. “Two years ago that would have terrified me. Now I can manage it.”
Further reading in Information Age
VMware’s disruptive revolution – November 2006
The virtual desktop – May 2006