In recent years, market watchers have expended considerable energy debating the IT function’s transition to a ‘real-time infrastructure’ (RTI). It is widely agreed that this is a desirable goal, and some of today’s IT decision makers will have already begun plotting their strategies for achieving an RTI.
But for a significant number, it is far from clear how to move from today’s infrastructure chaos to a more agile and responsive IT department of tomorrow. This was the challenge examined in Information Age’s most recent webinar, sponsored by IT services company Unisys.
The arguments for moving towards a real-time infrastructure reflect the need for IT to deliver better value to the business. In most organisations, the IT infrastructure has evolved to cope with peak business demands; this is inefficient, and when those demands change, IT can be painfully slow to respond.
There are probably as many definitions of what an RTI might look like as there are predictions about the technologies that will shape it. But Andrew Butler, an analyst at IT advisory group Gartner, pictures it simply: RTI is the transformation that shifts the IT function from being viewed as a cost centre to a profit centre.
Real-time roadblocks
Of course, such neat descriptions hide a tumultuous process, as IT builds “pooled resources” capable of responding instantly to organisational demands. This vision of IT on demand cannot simply appear overnight; it needs a careful, staged progression, explains Butler. Part of this will inevitably require greater use of virtualisation technology and of modular components such as blade servers, he adds.
But the increasing popularity of blade servers is already highlighting some of the difficulties that IT management will face. Currently, the blade market “lacks standards”, limiting the ability to mix blades from different vendors, says Butler.
Furthermore, the transition to a utility model of computing, necessary to deliver a responsive IT service to the business, is complicated by further disagreements over how software is licensed.
It remains unclear how software vendors will react to the changing nature of IT delivery, but they are already divided on issues such as multi-core processors. Intel believes that pricing based on a ‘per CPU’ model is proven to work, says Arun Shenoy, director of Intel’s digital enterprise group for the UK & Ireland. “But multi-core [processing] brings complexity to that.”
The software industry is split on whether a processor with multiple cores should be treated as a single processor or as several, he adds.
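To see why the split matters, consider the arithmetic on a single machine. The sketch below is purely hypothetical: the counting rules and the server configuration are invented for illustration and are not any vendor’s actual licensing terms.

```python
# Hypothetical illustration of the per-CPU licensing debate. The counting
# rules and the server configuration are invented for clarity; they are not
# taken from any vendor's actual pricing terms.

def licences_needed(sockets: int, cores_per_socket: int, per_core: bool) -> int:
    """Count licences for one server under two readings of 'processor'."""
    if per_core:
        return sockets * cores_per_socket  # every core counts as a processor
    return sockets                         # every physical package counts once

# The same dual-socket server fitted with dual-core chips:
print(licences_needed(2, 2, per_core=False))  # -> 2 (counted per socket)
print(licences_needed(2, 2, per_core=True))   # -> 4 (counted per core)
```

The same physical box yields two different licence bills depending on how ‘processor’ is defined, which is precisely the ambiguity dividing the industry.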
Virtualisation technology, which is being deployed to improve server utilisation, adds to the complexity. By removing the explicit link between software and hardware, virtualisation allows applications to run across multiple servers, reducing the impact of a server failure and potentially improving workload management. But it also raises the question of how to charge for software that may now be deployed on numerous servers, says Shenoy.
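The effect of that decoupling can be sketched in a few lines. The following is a hypothetical illustration only (the host names, capacities and placement rule are all invented): once workloads can be placed and restarted wherever capacity exists, the question of which server the software is licensed to no longer has a stable answer.

```python
# Hypothetical sketch of virtualised workload placement. Host names,
# capacities and the placement policy are invented for illustration; the
# point is that the software is no longer tied to one physical server.

hosts = {"host-a": 8, "host-b": 8, "host-c": 4}  # spare CPU cores per host
placements: dict[str, str] = {}                   # application -> current host

def place(app: str, cores: int = 2) -> str:
    """Start the app on the least-loaded host, as a hypervisor might."""
    host = max(hosts, key=hosts.get)
    if hosts[host] < cores:
        raise RuntimeError(f"no spare capacity for {app}")
    hosts[host] -= cores
    placements[app] = host
    return host

def fail_host(failed: str) -> None:
    """Take a host out of service and restart its applications elsewhere."""
    hosts.pop(failed, None)
    for app, host in list(placements.items()):
        if host == failed:
            del placements[app]
            place(app)

place("billing")        # the app may land on host-a today...
fail_host("host-a")     # ...and be restarted on host-b after a failure
print(placements)       # {'billing': 'host-b'} - same software, new server
```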
Alongside these difficulties, today’s IT decision makers also face the sheer complexity of the IT organisation itself. Many large companies have a mixture of in-house, managed, outsourced and potentially offshored IT services.
Getting these to act as a single pool of IT resources is beyond today’s management technologies, concedes Simon Shiach, global vice president for RTI marketing at Unisys. But there is no “big bang” approach that will work for the real-time infrastructure, he says. “It’s about taking incremental steps forward: looking at how you can optimise the current infrastructure,” he adds.
• For more detail, please visit www.information-age.com/webinars/rti