The languid, post-bubble state of the IT market belies a foundation-shifting movement well underway.
Grid computing and related technologies promise to overturn the existing architectural status quo and may be quietly shaking the industry out of its complacency. Grid now counts all major systems vendors, and a healthy and growing number of start-ups, as proponents. It is gaining intellectual acceptance in boardrooms not for its headline-making scientific applications – of which there have been many – but because of one very simple truth: it makes better use of computing power that is already in place. If businesses and service providers alike feel that they over-indulged in the 1990s, then grid offers a tonic of more efficient resource utilisation.
Grid computing uses peer-to-peer (P2P) networking, systems management and middleware technology to let applications share available processing power regardless of physical location. It implies comfortable evolution rather than disconcerting – and potentially expensive – revolution. The concept gains validity when put in the context of surrounding technological developments. Both the Napster-fuelled interest in Internet-enabled P2P computing and the more prosaic development of space-saving blade servers for data centre applications tie in to the grid evolution. Grid is also a pillar of grand, industry-driving concepts such as self-healing, autonomic systems and network management schemes.
"Grid is the third phase of Internet computing," says Mike Nelson, director of Internet technology and strategy for IBM. "The first was communications, including email and remote file transfers; the second was the web, which was content-oriented and a one-to-many model; and the third is many-to-many, which we were all introduced to by Napster," says Nelson.
Most convincingly, the technology to deploy grid exists and is proven in practice. But grid applications to date have been scientific and confined to closed communities of interest, with a few high-profile exceptions. Grid's commercial potential is the subject of much debate, and the obstacles to wide-scale deployment are formidable.
The resource-sharing concept behind grid is, in truth, as old as computer networking. The idea is to use all available central processing unit (CPU) cycles to accomplish tasks, rather than limiting an application's access to a single system. According to systems vendor Sun Microsystems, CPU utilisation on its workstations rarely exceeds 20% of available capacity; even with server applications, utilisation rarely exceeds 40%. Sun claims that deploying its 'Grid Engine' software in workgroup environments drives utilisation of available resources to more than 90%. These figures are generally accepted within the industry.
"IT managers base their modelling on peak demand, so silos of resources are over-provisioned," says Rick Hayes-Roth, chief technology officer of Hewlett-Packard's Software Solutions group. Grid deployment solves that problem, saving money and using the processing power delivered by Moore's Law more efficiently.
The concept, therefore, sits nicely with the corporate IT belt tightening prevalent today. It also fits in with the notion that there is a massive over-supply of telecommunications bandwidth – both within corporate firewalls and between them. In that way, the major contributors to the technology market slowdown – over-investment in enterprise infrastructure and telecoms networks – can be regarded as selling points for grid.
But efficiency and related cost savings represent just one side of the grid rationale. The other is that the computing power grid aggregates creates the opportunity to tackle previously intractable tasks. It is this vision that drove the early grid thinkers.
Work on what is now known as grid dates back to the 1970s. Like the Internet itself, grid research derives from efforts funded by governmental organisations. The European Commission was one of grid's earliest patrons, according to Wolfgang Gentzsch, Sun's director of grid computing. It has its roots in the study of distributed resource management and high-performance computing (HPC). The applications were then – and many still are – project-based, where communities of interest, usually scientific and among universities, needed to pool resources toward a common goal.
Grid projects were overlooked by much of the wider IT community for many years, particularly after client-server computing models gained support during the 1980s. But highly publicised grid applications have emerged in recent years. Notable examples include the DataGrid and EUROGRID projects, both funded by Brussels. Even more celebrated are the global FightAIDS@home and SETI@home [Search for Extraterrestrial Intelligence] projects.
Grid interest accelerated among academics and systems vendors after the delivery of the 'Globus Toolkit' in the late 1990s. Developed by the non-profit Globus Project of the Argonne National Laboratory and the University of Southern California's Information Sciences Institute, the toolkit is public-domain software that offers a set of security, resource directory, resource management, data management and communications functions required to implement grid.
One of the most alluring aspects of grid is that, while a radical departure in some ways, it encompasses concepts familiar to technologists. It is, for example, distributed computing at its heart – something most IT organisations have been deploying for a decade or more. It also takes advantage of server clustering and load balancing, and forces the issue on policy-based management techniques.
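A minimal sketch makes the familiarity concrete. The toy scheduler below – with invented machine names and task costs, not any vendor's product – applies the standard load-balancing move that grid generalises: hand each independent task to the least-loaded machine, wherever it sits.

```python
import heapq

def schedule(tasks, machines):
    """Greedy load balancing: place each task on the machine with the
    least committed work, tracked with a min-heap."""
    heap = [(0, name) for name in machines]   # (committed work, machine)
    heapq.heapify(heap)
    placement = {}
    for task, cost in tasks:
        load, name = heapq.heappop(heap)      # least-loaded machine
        placement[task] = name
        heapq.heappush(heap, (load + cost, name))
    return placement

tasks = [("render-1", 4), ("model-2", 9), ("sim-3", 3),
         ("model-4", 7), ("render-5", 2)]
machines = ["workstation-a", "workstation-b", "server-c"]
for task, machine in schedule(tasks, machines).items():
    print(f"{task} -> {machine}")
```

Real grid middleware layers security, resource discovery and data movement on top of this core idea, with policy-based management deciding which tasks are allowed onto which machines in the first place.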
Market opportunities
The first step for major systems and software players will be to focus on departmental applications for their commercial marketing efforts. Although the terminology used changes from company to company, there is agreement on three real and potential grid markets: enterprise or campus, communities of interest and global.
Enterprise grids are generally viewed as providing a genuine market opportunity today. The resource-sharing technology exists and the oft-stated barriers to grid deployment – security concerns and platform heterogeneity – are more easily overcome when the application resides within a company's firewall.
The benefits of using grid computing for corporate applications are many and varied. First, businesses can get more out of the computers already in place. This can, of course, yield financial savings, but it can also yield competitive advantages. In the cut-throat pharmaceuticals industry, for example, companies are deploying grid to dramatically reduce the time it takes to do molecular modelling. By dynamically pooling all available computing resources, these companies can cut time-to-market by many months. In addition, grid can enable applications – and even businesses – that would otherwise require access to costly supercomputing resources.
The advantages of grid computing for enterprise IT are also far reaching, say those selling the systems. The capability to effectively share resources and manage entire corporate systems with a single view simplifies the process of achieving IT nirvana: truly matching technology to business objectives by using policies. If, for example, a life sciences company is beaten to market in one area of genetic research, it can quickly shift grid-enabled computing resources to other, more promising projects by adjusting policies. The risk of wasted IT capital is potentially eliminated.
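What 'adjusting policies' might mean in practice can be sketched under assumptions: here a fixed pool of CPU-hours is divided among projects in proportion to policy weights, so that shifting priorities is a one-line edit rather than a procurement exercise. The project names and figures are hypothetical.

```python
POOL_CPU_HOURS = 10_000  # fixed installed capacity, shared via the grid

def allocate(weights):
    """Divide the pool among projects in proportion to policy weights."""
    total = sum(weights.values())
    return {proj: POOL_CPU_HOURS * w / total for proj, w in weights.items()}

# Before: genetic-research project A is the priority.
policy = {"research-a": 6, "research-b": 3, "modelling": 1}
print(allocate(policy))   # research-a receives 6,000 CPU-hours

# A rival publishes first in area A; redirect the grid to project B
# by editing the policy, with no change to the installed hardware.
policy = {"research-a": 1, "research-b": 7, "modelling": 2}
print(allocate(policy))   # research-b now receives 7,000 CPU-hours
```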
There is nearly universal agreement that this is a viable present-day market. "The technology is absolutely in place today," says Ian Baird, chief business architect of Canada's Platform Computing, a grid systems company that has been selling distributed computing software into the corporate market since 1992.
But there is disagreement on applicability. Conservatives contend that the commercial grid market is really confined to departmental clusters or virtual workgroups, rather than complete enterprises. "At the departmental level, there are no obstacles," says Sun's Gentzsch. "At the enterprise level the obstacles are not technological, but instead [relate to] the complexity of the IT environment. There's a lot of work to be done there."
Indeed, the heterogeneous nature of most corporate environments demands a standards-based approach to grid, which does not yet exist. The widespread adoption of Internet Protocol (IP) and of the Microsoft Windows desktop operating system is believed, however, to have reduced that complexity. In February 2002 an IBM-led effort called the Open Grid Services Architecture (OGSA) was established, and much of the industry has endorsed it.
In addition to establishing standards for application-layer interoperability, the OGSA also addresses concerns about grid's co-existence with 'web services' architectures such as Microsoft's .NET, Sun's Sun ONE and IBM's WebSphere platforms. Web services technology allows business transactions to run on servers scattered across the Internet, while grid distributes and co-ordinates application processing across many systems. The two should therefore be complementary, not competitive. For that to happen, however, an integrated development approach from the ground up is required.
The next phase will be extending grid beyond the corporate firewall. That will be a big step, say experts. Universities in Europe and elsewhere are already using grid technology to benefit research projects that require vast processing power. Strong arguments are also made that like-minded but separate businesses in, for example, the aerospace industry could benefit immediately from sharing computing resources.
But security concerns and policy administration are seen as profound obstacles to a lucrative market any time soon. For a technology market to blossom fully, say veterans, it must become meaningful beyond high-end, scientific applications. Grid will have to penetrate the financial sector, for example – an industry where security concerns over grid's resource-sharing mechanisms are likely to form the primary focus of debate. "The notion of money transfer and risk will greatly affect the commercial market for grid," says Vernon Turner, group vice president of server systems at IDC, the market research group. He says that a high-street bank and a top-tier insurance institution may have every reason to co-ordinate work on complex transactions using a grid – but such collaborative projects may be limited by security concerns.
Industry hype
Marketing hyperbole reaches its peak when proponents discuss grid computing in its global context. Grid technology enables 'utility computing', as IBM refers to it. Rather than building and managing private IT systems, companies will be able to use a third-party grid to access computing cycles, data storage and applications logic on demand. "Enterprises have all the reasons to adopt utility computing," says HP's Hayes-Roth, "and economies of scale will dictate [third-party] data centres."
The same principle could be applied to the consumer market. There is a rare consensus among the major systems vendors on this future for grid and the computing world. But at the same time, these companies are quick to temper their enthusiasm by acknowledging one simple reality: this is not going to happen tomorrow.
"Service grids are years away from development," says Platform's Baird. Peter Jeffcock, Sun's grid computing group marketing manager, is even more blunt: "We don't see a lot of demand for global grid in the near term."
Nevertheless, the major players – IBM, Sun, HP and Compaq, as well as Platform and a growing number of start-ups – do not hesitate to characterise the aggregate, long-term market opportunity for grid computing as 'huge'. But specific numbers on the eventual size of the market are difficult to come by, and the companies are equally vague about timing. Dave Fish, president and CEO of Avaki, a venture capital-funded grid software start-up based in Massachusetts, is one of the few willing to go out on a limb. "The huge market will be in the 2005-2010 timeframe," he predicts.
Ultimately, he says, the commoditisation of high-performance, high-density processing platforms makes it "inevitable" that grid's influence will extend beyond specialised scientific applications.
That inevitability helps to explain why systems vendors are pushing grid even though its eventual, large-scale adoption seems to run contrary to their business models. "We're pursuing a strategy that foresees the commoditisation of computing resources," HP's Hayes-Roth admits. "Grid will reduce the aggregate demand worldwide for boxes. But we see this as inevitable, so we embrace it."