Internet 2.0
The Internet is dead, according to a recent statement from a BT representative [see ‘The lawless Internet', Information Age November 2004] – a view echoed by a number of leading technology players including Intel and the Jericho Forum.
There's little doubt that the Internet, as it stands, will be unable to cope with future demands. No single 'straw' will break the World Wide Web's back – it is more likely to be a combination of factors: increased business traffic, the continued proliferation of security attacks, the uptake of bandwidth-hungry multimedia applications and the influx of new users from developing countries (not to mention unchecked spamming). One thing is clear: action needs to be taken – but how and when?
Proposals range from building out private networks for business Internet users, to deploying technology to monitor usage and control the flow of information. Whilst all of these ideas have some merit, they are being pushed by vendors more interested in selling their technology than in creating a solution for the good of all those who have come to rely on the Internet ecosystem.
The Infranet Initiative Council was established last year with precisely that aim: the co-operative development of a public network that combines the ubiquitous connectivity of the Internet with the assured quality, performance and security of a private network.
The Council, which includes major suppliers such as BT, Orange, AOL, IBM, Siemens, Ericsson and Lucent, is engaged in trials of non-proprietary solutions. The Council sees an 'infranet' as neither the public Internet nor a private network infrastructure: infranets will be built individually by service providers, but interconnected to form a global 'meta-network'. The technology to make this vision a reality already largely exists, but industry collaboration is imperative. What is needed now is broad industry agreement on a common platform to deliver this, the next-generation Internet. United we stand, proprietary we fall.
Richard Brandon
Senior marketing director, EMEA
Juniper Networks
Hidden cost
I read your article on Lloyds TSB outsourcing 2,500 jobs to India on Information Age's web site (www.infoconomy.com). I'm surprised that LTSB have publicly announced it because there is bound to be a consumer backlash in the UK.
On a personal basis, I refuse to give business to organisations that take part in 'service importing', which is my term for offshoring. My experience of outsourced service desks is that service levels are poor and the system cannot cope with questions that fall outside predetermined scripts. The objective of a service desk is to answer calls quickly, because support fees are generally based on the number of calls answered and the time to respond. The customer's actual need, finding a solution, comes a poor second and usually involves playing telephone tag with lots of wait queues.
When this process is offshored to people who may have no product knowledge, the effect is even worse.
I've negotiated large outsourcing contracts and have also been involved in unwinding old ones, but throughout I could never lose sight of the fact that the company providing the outsourcing is doing so to make a profit. In most contracts, the [large suppliers] of this world look to break even during the first 18 months or so and to make good profits thereafter. I'm convinced that management teams outsource only the parts of the business that they have been unable to manage in the past. A more cost-effective solution might be to get rid of the inefficient managers!
Service importing/offshoring has the 'advantage' that corporations avoid paying social and employment taxes, and pension funding, in the client countries. In effect, the corporation avoids its social responsibilities.
The government should level the playing field and charge a service importation tax. That would make for a fairer cost comparison.
Charles Smith
Oaksys Tech
Capture to comply
I read with interest the piece on 'Dynamic compliance' in November's Information Age Advisory Series on Business Intelligence (BI), which highlighted Basel II, Sarbanes-Oxley and the essential corporate data that supports governance and transparency. The personal liability of CEOs and CFOs for any governance shortfalls, set against the competitive advantage compliance can bring to the business, is no doubt an effective 'good cop, bad cop' form of persuasion. As [industry analyst group] Gartner says, compliance with efficiency and faster decision-making is key – compliance with agility and top-line performance being CEO/CFO nirvana.
Yet it appears the essential BI building blocks of 'qualitative data' and 'procedural efficiencies', which create the conditions for informed decisions, are either being ignored or, worse still, are unknown factors.
A further compliance consideration has also recently emerged. Basel II and Sarbanes-Oxley require each financial organisation to report on qualitative data. This is causing considerable concern, as the regulation is unsurprisingly rather vague. However, its intent is to reduce the risk arising in qualitative areas such as money laundering and mis-selling.
The ability to capture every nano-decision taken whilst following procedures provides the interactional data which we believe will meet the challenges of the new regulations. Each nano-decision, captured as a real-time audit record, gives financial organisations unprecedented capabilities for agility and for qualitative regulatory reporting. From an operational risk perspective, some of our clients now believe that this type of qualitative data, if based on high-quality procedures, will also have a positive impact upon capital adequacy.
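By way of illustration only – the letter does not describe Optimal Partnerships' actual system – here is a minimal sketch, with all names hypothetical, of what capturing each decision as a timestamped, real-time audit record might look like:

    import json
    from datetime import datetime, timezone

    class AuditTrail:
        """Hypothetical sketch: record every decision taken while
        following a procedure as a timestamped audit event."""

        def __init__(self, procedure_id, path):
            self.procedure_id = procedure_id
            self.log = open(path, "a")  # append-only audit file

        def record(self, step, decision, context=None):
            event = {
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "procedure": self.procedure_id,
                "step": step,
                "decision": decision,
                "context": context or {},
            }
            self.log.write(json.dumps(event) + "\n")  # one event per line
            self.log.flush()  # flushed immediately: a real-time audit trail

    # Usage: an anti-money-laundering check inside an onboarding procedure
    trail = AuditTrail("customer-onboarding", "audit.log")
    trail.record(step="identity-check", decision="passport accepted",
                 context={"document": "passport", "expiry_ok": True})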
In a recent article, 'The CFO's Dilemma', Mike Schonhut, editor of Better Management, observes: "The complex web of regulatory legislation continues to grow and the costs of compliance are certainly not trivial… Since this money has to be spent, some forward-thinking companies are looking at leveraging the resources this buys in other areas of the business. Their intent is to mitigate the cost of compliance by funding accelerated processes or making improvements [in planning or customer relationships]."
In service-based organisations, the 80% of the business handled through lean processes typically represents only 20% of the costs; the other 20% of the business now accounts for 80% of the costs and carries the biggest exposure to non-compliance. Procedures are the organisational instruction set that determines workforce agility, adaptability and compliance. It is for these reasons that businesses need to capture their qualitative data and focus on procedural efficiency and effectiveness.
Simon Crosbie
Chief executive
Optimal Partnerships (UK)
Power rating
As designers of data centres and data centre air conditioning, we would like to point out a major inaccuracy in your article 'The Melting Data Centre' (Information Age November 2004). In simple design terms, 100% of the power consumed in a data centre is turned into heat, so a cabinet drawing 2.5kW of power generates 2.5kW of heat. It is electric motors that turn only half their power into heat. Such misunderstandings are the reason data centre managers have the problem you were writing about.
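To spell out the letter's arithmetic (our illustration, not the author's; the 3,412 BTU/hr-per-kW figure is the standard conversion factor for sizing cooling plant):

    KW_TO_BTU_PER_HR = 3412.14  # standard conversion: 1 kW ~ 3,412 BTU/hr

    def cabinet_cooling_load(power_kw):
        """Essentially all electrical power drawn by IT equipment in a
        data centre is dissipated as heat, so the cooling plant must be
        sized to remove all of it."""
        heat_kw = power_kw  # power in = heat out
        return heat_kw, heat_kw * KW_TO_BTU_PER_HR

    heat_kw, heat_btu_hr = cabinet_cooling_load(2.5)
    print(heat_kw, round(heat_btu_hr))  # 2.5 8530: a 2.5kW cabinet needs ~8,530 BTU/hr of cooling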
Mark Allingham
Managing director
Comms Room Services
Mobile blind spot
From the BlackBerry to the Connect card, there is currently a lack of knowledge of mobile devices within traditional IT departments, which means support is often backed off to the mobile network operator – a model wholly inconsistent with traditional IT support.
It is estimated that 75% of mobile computing support calls are escalated to the network operator, yet fewer than 20% actually relate to network problems – meaning at least seven in every ten escalated calls are misdirected, and users continue to struggle with general support issues such as lost email and confused calendars.
Users of mobile devices regard them as an extension of the wired network and therefore need and expect the same level of support – so escalating calls to the network operator should be a matter of last, not first, resort. The unprecedented overlap between IT and telephony means it will be some time before this specific support expertise is widely available. However, a viable short-term option is co-sourcing.
A co-sourcer provides its own permanent staff to carry out first-, second- and third-line support, as well as training and help with initial set-up. This gives companies access to the mobile computing skills they need alongside their existing in-house IT support function.
Today, mobile computing is in its infancy, but it is rapidly becoming an integral component of a successful business. As such, jeopardising the quality of service through ineffectual support is surely unacceptable.
Paul Whitlock
Service management consultant
Plan-Net
Dirty data
In your article, 'MFI's supply chain woes' [Information Age October 2004], I noticed that you refer to five key issues contributing to software failures, whereas in the article there are only four bullet points.
I am making the presumption that the missing bullet point would refer to the cleanliness and integrity of the data underlying the software application.
In my personal experience of migrating to a SAP ERP system at the end of the 1990s, the vast majority of the problems encountered by our customers related to dirty, incomplete or totally incorrect data. Most of the remaining problems stemmed from misunderstandings of business processes, which resulted in incorrect configuration of the software.
Sid Seton
Data analyst & archivist
Company name supplied
Microsoft exposure
Battening down the hatches – configuring firewalls, anti-virus software, anti-spam and web content filtering software, and installing security patch after security patch until they are coming out of your ears – is this really what protecting your corporate identity should be about?
Ironically, a fear of technology has sent thousands of companies into the arms of an IT vendor that has some of the most complex, resource-hungry and ultimately insecure server technology on the market: Microsoft.
Worse, that same fear has resulted in organisations eschewing the accepted reliability and low cost of Linux due to its 'tech-head' reputation. Yet packaged Linux server solutions requiring minimal expertise can support hundreds of users at a fraction of the cost of a Microsoft option and are secure by the very nature of their design – straight out of the box.
If organisations want to achieve the dream of an invisible, reliable box that provides the basic business tools such as email, Internet access and file sharing but without the unnecessary exposure to hackers, they need to stop following the crowd and opt for an IT solution designed to work quietly and reliably in the background.
So next time your box gets spammed or another virus stops play for half a day, remember: grumbling or kicking the server is not the answer. The solution is to sever the Microsoft addiction and venture out into the thinking person's world of reliable systems and affordable solutions that can be seen but not heard.
Malcolm Cartledge
Director
Kyzo
Social engineering
RE: Letter from John Stewart, CEO, Signify, November 2004. I disagree with John's point that businesses need to spend even more money on advanced IT solutions to authenticate users. We know where he's heading: lots more costly and buggy IT.
The real problem is "social engineering" and users who don't understand the value of their organisation's or customers' information. We do a great deal of security and data protection training and consultancy, and it is frightening that, once you get below the managers to the people actually processing the data, most users have not been inducted, trained or managed in a way that helps them appreciate the risk to, and value of, their data. Culture is key, but sadly it is easier and sexier to buy a new IT system than to train staff in "soft skills" like security.
Martyn Cattermole
Director
Assetz Consulting Ltd