History, as doubtless any student of the computer industry will point out, tends to go in cycles.
When interactive computing took off in the 1970s and 1980s, the predominant model – indeed the only model – was centralised processing, with mainframes (and later minicomputers) sitting at the centre. Employees communicated with the system through ‘dumb’ terminals that were little more than a means of displaying characters (in block mode) and passing these blocks to and from the processing engine.
There were obvious shortcomings to that model – frustrations that were highlighted when the PC came along in the early 1980s and provided users with local processing power. That autonomy heralded the development of client-server computing, in which most application logic was no longer held centrally but run locally on the desktop.
It took half a decade of rollouts for the inefficiencies of that model to become clear. Spreading all that computing power throughout an organisation just made it harder to fix things when they went wrong, more difficult to ensure everyone used the same software, more tempting for would-be thieves, and easier for security breaches to occur. With the advent of Internet computing in the late 1990s and the possibility of delivering applications over the web to a browser, the PC’s function looked increasingly over-engineered, and calls went up that it should be replaced by something much less costly to buy and maintain – in a phrase, the network computer or the thin-client device.
The argument did not win over everyone immediately. Even now, half a decade later, sales of thin client computers are still far outweighed by those of network-attached PCs.
But in recent times, demand for thin clients has started to surge. According to IDC research, the conventional PC market grew only 1.4% in 2002, but the thin client market expanded by 19% – albeit on a total number of units shipped worldwide of only about two million. (Almost 150 million PCs were shipped in 2002.) An even bigger growth rate is expected for 2003, say analysts.
As customers roll the devices out in greater numbers, suppliers are making their move. “It’s a very big opportunity for us,” says Shaun Hobbs, Hewlett-Packard’s (HP’s) UK head of commercial desktops. HP had originally partnered with Wyse, the biggest seller of thin client computers, but after seeing the market figures – and the forecasts – it decided to develop its own range. “It’s a hardware revenue stream for HP, which is something we haven’t focused on this hard for a couple of years,” he says.
HP is not alone in renewing its interest in thin clients. After past attempts stalled, Sun Microsystems has upgraded its Sun Ray thin client systems and is throwing substantial sums behind a sales and marketing drive. It is also, in Silicon Valley parlance, eating its own dog food. Rolling out thousands of the $700 machines, which come with Sun’s desktop software, has already cut millions of dollars off Sun’s IT costs, says CEO Scott McNealy. The company has also saved millions by reducing power consumption “by not having a Wintel ‘space heater’ in each of our offices,” he says.
The intensified push by some of the world’s biggest computer makers underlines how far the market has come. Alongside them, the pioneers of the market – including Wyse, Neoware and VXL – remain powerful, say analysts. Meanwhile, Microsoft, which develops two of the most popular operating systems for thin clients (CE.Net and Windows XP Embedded), has begun a marketing campaign based around the services for thin clients available in Windows 2003 Server.
“There’s great demand,” says Mark Tennant, Microsoft’s Windows servers product marketing manager. “With IT budgets being reduced while the demands placed on IT departments are increasing, many customers are looking for a way to have a centralised, secure, managed solution.”
These budgetary pressures have started to make a new generation of thin clients attractive even though the difference in price between a low-end PC and a thin client has narrowed.
“The cheapest thin client we make is £189,” says Stephen Yen, general manager, EMEA of Wyse. “If you were a retailer rolling out shops nationwide, the cost of loading a PC with software, shipping it and sending a technician to install it would probably match the price of the thin client, which you can just ship in a box.” The total cost of ownership of a PC is even higher than a thin client’s because a PC needs more electricity and has more moving parts, which are prone to break and generate more heat. The result, says Yen, is a total saving of between 25% and 30% when thin client computers are installed.
But while thin clients have always offered a strong ‘total cost of ownership’ story, what is making them more appealing today is that the difference in functionality between a thin client and a standard PC has narrowed significantly. Once regarded as dumb terminals, thin clients now support graphics and many of the same capabilities as a PC, except for hard drives. Wyse and VXL have differentiated their products by improving specifications and manageability while maintaining the inherent advantages for enterprises of thin clients.
Like their PC cousins, some thin client machines use faster microprocessors, others have a PCI slot for wireless networking, and still others have flash memory so that applications can be installed locally if necessary. Some thin clients, including those made by Wyse, come with their own management software, while HP partners with Altiris for deployment software.
One key part of the growing acceptance of the thin client is the increasing reliability of today’s servers. The falling cost of back-up technology has also helped.
As a result, the National Health Service and other normally risk-averse organisations in the UK are regarding thin clients as safer investments. Until recently, relatively small implementations were the norm. But the biggest growth today is coming from customers wanting to kit out thousands and even tens of thousands of users, says Lewis Gee, UK managing director of Citrix Systems, which developed the thin client capabilities that Microsoft incorporated into its server software. And these projects are no longer confined to call centres and point-of-sale systems; many are enterprise-wide schemes.
However, many CIOs have still to be won over. Others, such as Martin Ellison, the head of IT for Britannia Building Society, would like to roll out thin clients but are held back by other factors. “We have 3,000 desktops. We’ve standardised in various areas, but there are a number of PCs running different combinations of applications. All that creates complexity – even with co-existence testing to see which applications work well together on the desktop – and we get a high incidence of errors that lead to support calls and fault-fixing activity. Then there’s the whole complexity around software distribution. We’d make a big inroad into that with thin clients.”
Ellison has been looking at implementing thin clients for the last seven years or so, but despite his best efforts, he still cannot roll out the technology because some of his key applications are still not accessible through a browser. “I believe in the business case for thin client, but it’s just a matter of convincing software vendors to move in that direction.”
As Ellison has discovered, most thin client software problems stem from legacy applications or poor development techniques, rather than fundamental problems with thin client architecture. But even newer applications do not always come with a thin client version.
A number of industry initiatives are addressing this difficulty, however. Citrix, for example, has established a network of almost 200 developers to ensure that as many applications as possible run as well as possible in a thin client environment. Microsoft has issued guidelines to developers about how to write “great terminal server applications,” says Tennant.
Such commitment signals that thin client computing may now be ready for the corporate mainstream.
Thin Client Architecture in practice: the National Blood Service
Blood shortages are a constant problem for the National Blood Service (NBS), part of the National Health Service. And technology issues have not always helped it address the challenge.
Before 2000, the organisation’s PCs were of inconsistent quality, support costs were high, the pressure for more modern applications was intense and new rollouts were perceived as costly, time consuming and lacking in any guarantee of a successful implementation.
A thin client architecture was chosen as the alternative. “Thin client devices gave us the option of running legacy applications in a more manageable, cost-effective environment,” says Neil Hogg, general manager of IT at NBS. “We felt that Windows-based terminals were more than capable of working effectively in our environment.”
A big challenge was getting users comfortable with the new machines, says Kevin Cartwright, programme manager for IT and facilities at NBS. “When the initial change to a server-based computing solution was proposed, people were unsure about the value it would bring,” he says. “We spent months at the beginning preparing documentation and getting the PR right. I went round senior management groups and explained the technology. If necessary, we would go round to individual computer users.”
In December 2003, the rollout of 2,500 Wyse terminals at about 80 locations was completed.
The systems are managed from two national data centres about eight miles apart, connected by high-speed fibre links to provide a single virtual data centre. The benefits have been considerable, says Hogg. “The combination of architecture and management software has saved us a huge number of man hours since the initial implementation,” he says. Support is easier, with ‘fix times’ in hours rather than days as before, and far fewer support staff are required.
“We now have customers demanding to be ‘thinned’. The take-up has been exponential,” says Hogg.