For anyone charged with building and operating corporate information technology infrastructure, 2009 was a trying year. The economic downturn forced many organisations to shelve development plans in favour of sweating already hard-worked assets to their limits.
That said, it was far from being an entirely bad year for the IT infrastructure sector. Sales of server and desktop systems certainly declined, but the industry continued to deliver innovative infrastructure products and services, and customers continued to invest in their IT fabric. In fact, at a time when private and public sector organisations were intent on making every penny count, a great many of them seemed to believe that the best way to do so was to invest in their business IT infrastructure.
Three years ago this would not have been the case. Back then, the enterprise IT platform was typically composed of widely distributed and poorly utilised servers and their associated ‘stove-pipe’ applications. Money invested in these inefficient and inflexible server estates only added to the problem, and it was commonly cited that upwards of 75% of corporate IT budgets went on maintaining these systems, leaving little to invest in the development of new services to meet changing business needs.
It would be an exaggeration to say that virtualisation has changed that completely, but it has made a profound difference. Server consolidation and the dynamic deployment and redeployment of virtual machines have transformed application management at hundreds of organisations, accelerating IT’s ability to meet changing business needs and reducing costs into the bargain. This year’s Effective IT survey found server virtualisation to be as popular as ever – all but a third of respondents have deployed the technology or plan to do so in the coming year, and 60% of adopters rate it as either effective or very effective.
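To make the idea of ‘dynamic deployment and redeployment’ concrete, the sketch below shows how an administrator might enumerate and live-migrate virtual machines using the libvirt Python bindings. It is an illustration only, under assumed conditions: the connection URIs, the destination host and the ‘app-server-01’ domain name are hypothetical placeholders, not a reference to any particular product or deployment.

```python
# Illustrative sketch: listing and live-migrating virtual machines with the
# libvirt Python bindings. The URIs and domain name below are hypothetical.
import libvirt

# Connect to the local hypervisor and to a destination host (placeholder URIs)
src = libvirt.open('qemu:///system')
dst = libvirt.open('qemu+ssh://destination-host/system')

# Enumerate the virtual machines the source host knows about
for dom in src.listAllDomains():
    state = 'running' if dom.isActive() else 'shut off'
    print(f'{dom.name()}: {state}')

# Relocate one running guest to the destination host without shutting it down
guest = src.lookupByName('app-server-01')  # hypothetical domain name
guest.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

dst.close()
src.close()
```

The ease of that last call is precisely the double-edged sword the survey respondents describe: a guest can be moved, copied or spun up in seconds, with no physical audit trail to keep track of where it has gone.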
However, virtualisation’s shortcomings began to surface in 2009. Some of these, such as its confusing impact on software licence terms and the potential threat posed by compromised hypervisor security, have so far had little real impact. The same cannot be said for the issue of virtualisation’s manageability. As some early adopters have discovered to their cost, the freedom from conventional physical constraints that makes virtual machines so easy to build, deploy, relocate and duplicate can also make them nightmarishly difficult to manage – not least because they place demands on administrators that conventional systems management tools were never designed to address.
This gap between the complex new requirements of virtual machine management and the ability of systems tools vendors to meet them is now being addressed – albeit slowly. Specialists such as Embotics are tackling the functional shortcomings of conventional tools with products that combine traditional systems management capabilities with features more usually associated with document management and compliance. Meanwhile, BMC, HP, IBM, Symantec et al. are developing or acquiring technologies that promise to extend the reach of their established physical systems management consoles to the virtual world.
The speed with which both these camps can bring virtual machine management in line with physical systems management is dictated to a significant extent by the policies of the hypervisor vendors – Microsoft, Citrix and VMware. Encouragingly, these companies have all prioritised the development of management extensions to their core virtual machine software, and adopted an open approach to fostering ‘co-opetition’ in the market.
Soon, perhaps as early as this year, the technological barriers to integrated virtual and physical systems management will cease to be significant, leaving IT professionals facing the challenge of developing management processes that span both worlds equally well. This may take a little longer, and it may be several more years before virtual and physical IT infrastructures are managed as one seamless environment, but it is a vision that is already starting to become reality in the rapidly expanding world of cloud computing.
Gaining acceptance
Certainly, although cloud computing is still far from being part of the enterprise IT mainstream, its proponents can look back on 2009 with some satisfaction.
In the public cloud domain, for instance, the growth of pioneer infrastructure services such as Amazon’s Elastic Compute Cloud (EC2) encouraged a rash of new cloud-based offerings from established managed services players including RackSpace and Savvis. There was a similar surge of interest in cloud-based application service delivery, as software vendors of all kinds rushed to follow in the footsteps of Google and Microsoft with subscription-based services of their own.
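What makes infrastructure services such as EC2 attractive is how little ceremony stands between a consumer and a running server. The sketch below illustrates the consumer’s side of that transaction using today’s boto3 SDK rather than the tooling of 2009; the AMI ID, region and instance type are hypothetical placeholders chosen purely for illustration.

```python
# Illustrative sketch only: provisioning a server on Amazon EC2 with the
# modern boto3 SDK. Image ID, region and instance type are placeholders.
import boto3

ec2 = boto3.resource('ec2', region_name='us-east-1')

# Request a single virtual server from the EC2 pool
instance = ec2.create_instances(
    ImageId='ami-00000000000000000',   # placeholder machine image
    InstanceType='t3.micro',
    MinCount=1,
    MaxCount=1,
)[0]

# Wait until the instance is running, then report its public address
instance.wait_until_running()
instance.reload()
print(f'{instance.id} is running at {instance.public_ip_address}')

# Pay-per-use: release the capacity the moment it is no longer needed
instance.terminate()
```

A few lines of scripting, metered by the hour, is the utility-based delivery model in miniature – and it is exactly this experience that the managed services players set out to replicate.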
Examples of private or hybrid public/private cloud infrastructure projects also started to become widely discussed (if not actually widely deployed) and, if the marketing rhetoric of infrastructure vendors and conference organisers was anything to go by, a broad consensus emerged in 2009 that the future of IT would certainly be a cloudy one. Maybe not this year, however.
Although it seems certain that ‘cloud’ services and ‘cloud’ technologies will continue to gain market share this year, it is much less clear what these services will be, and how exactly they will be incorporated into the mainstream IT strategies of the organisations that adopt them. The key attraction of cloud computing – the promise of a utility-based IT service delivery model – is well understood and highly desirable. But the means by which this can be most effectively realised is, in the eyes of most potential IT service consumers, shrouded by a plethora of poorly defined and often contradictory products and services.
The message for IT vendors must be that if they expect cloud computing to drive demand for new products and services in 2010, they must do a better job of defining what these products and services do, and how they can be made to work seamlessly with legacy systems.