Grid technology, in theory, promises to revolutionise computing. By powering their applications with a pool of computers spread across a loosely connected network, even modest users should be able to access unprecedented processing power – on demand.
Thus far, however, grid has mainly been used by the science community, where an emphasis on collaboration and a tendency towards compute-intensive workloads have made grid a useful means of getting supercomputer power for a fraction of the cost. Those early applications have not helped many commercial companies make a strong business case for grid, not least because applications often have to be rewritten to run over a grid.
Additionally, massive-scale projects such as the construction of the Large Hadron Collider (LHC) at the CERN laboratory in Geneva, which will be the most powerful particle accelerator in the world, have meant the scientific community has been forced to turn to grid architectures to achieve its aims. The LHC project requires a grid of 100,000 high-end PCs to process data every hour of every day for two years. Its storage requirements are mind-boggling: 12-14 petabytes stored annually – the equivalent of a stack of data-packed CDs 20 km high.
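The CD comparison is easy to sanity-check. Assuming a standard 700 MB disc about 1.2 mm thick (figures not from the article), a back-of-the-envelope calculation lands close to the quoted 20 km:

```python
# Back-of-the-envelope check of the CD-stack comparison.
# Assumed figures (not from the article): a CD holds ~700 MB
# and each disc is ~1.2 mm thick.
PETABYTE = 10**15          # bytes
CD_CAPACITY = 700 * 10**6  # bytes per disc
CD_THICKNESS_MM = 1.2      # millimetres per disc

for annual_pb in (12, 14):
    discs = annual_pb * PETABYTE / CD_CAPACITY
    stack_km = discs * CD_THICKNESS_MM / 1e6   # mm -> km
    print(f"{annual_pb} PB/year ~ {discs / 1e6:.1f} million CDs "
          f"~ a stack {stack_km:.0f} km high")
```

At 12 petabytes a year that works out to roughly 17 million discs and a stack of about 21 km, so the article's figure is in the right ballpark.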
Such programmes, while impressive, are only of passing interest to those looking for clear business benefits of investing in a grid architecture. According to Dr Mark Parsons, the commercial director of the Edinburgh Parallel Computing Centre (EPCC), the main problem is that "the whole domain is driven by technological rather than business needs".
Over the last five years, the promise of commercial grids has been couched in terms of more efficient use of hardware resources; a low-cost alternative to high-end servers; a means of accessing processing power when, and in the quantity, required; and a greater ability to respond to rapidly changing business conditions.
The problem, however, is that there is no clear roadmap for grid's future. The idea of a grid that fosters collaboration and data sharing across IT systems is anathema to many businesses – and a boon to others. Parsons gives the example of financial services companies, which require tight control over their data to ensure regulatory compliance; at the opposite extreme, car maker Audi desperately wants to set up collaboration networks with its partners and suppliers.
In addition, most commercial grid software is nowhere near production quality. "Most vendors have some sort of grid story, but it is in its infancy and few understand how they are actually going to make money from it," says Parsons. "No one is doing data grids, and currently the community can't agree on a common grid middleware platform. It's very damaging."
That grounded assessment aside, Parsons is highly enthusiastic about grid's potential to give businesses greater control over their data. Today, many companies store their data in multiple databases using multiple formats. A distributed data access and integration system – Open Grid Services Architecture Data Access and Integration, which goes by the unwieldy acronym OGSA-DAI – allows them to leave the data in situ and still query it without losing control.
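To make the idea concrete, the sketch below shows the in-situ pattern in miniature: a query travels out to each database, and only the results travel back. All class and method names here are hypothetical illustrations of the concept, not OGSA-DAI's actual service-oriented Java API.

```python
# Conceptual sketch of in-situ federated querying, in the spirit of
# OGSA-DAI. Names are invented for illustration, not the real API.

class RemoteDataSource:
    """A database that stays at its owner's site; only results move."""
    def __init__(self, name, rows):
        self.name = name
        self._rows = rows          # stands in for a live database

    def query(self, predicate):
        # Filtering happens at the data's home site, so the owner
        # keeps control over what leaves the premises.
        return [row for row in self._rows if predicate(row)]

class FederatedQuery:
    """Fans a single query out to many sources and merges the results."""
    def __init__(self, sources):
        self.sources = sources

    def run(self, predicate):
        results = []
        for source in self.sources:
            results.extend(source.query(predicate))
        return results

# Two databases in different locations, queried as if they were one:
eu_orders = RemoteDataSource("eu", [{"id": 1, "total": 990},
                                    {"id": 2, "total": 40}])
us_orders = RemoteDataSource("us", [{"id": 3, "total": 1200}])

grid_view = FederatedQuery([eu_orders, us_orders])
print(grid_view.run(lambda row: row["total"] > 500))
# -> [{'id': 1, 'total': 990}, {'id': 3, 'total': 1200}]
```

The point of the design is that no central copy of the data is ever made: each site answers for itself, which is what lets a regulated business share query results without surrendering the underlying records.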
Parsons looks forward to a point where grid is so ubiquitous that it is synonymous with computing. That may not happen in this decade, but it could well happen in the next.