In Orlando, Florida, a credit card holder makes a routine purchase over the phone. The payment goes through without a query. A few minutes later, the same card is used to attempt another purchase – this time from Taipei, Taiwan. What does the automated credit approval system do?
What it should do is combine the two events and create a “possible fraud event”. That message will alert the authorisation system and any other systems that need to know.
And it will happen so fast that the second transaction, at least, will be stopped.
But that will probably not happen today. Credit card companies have all kinds of systems for recognising and tracking fraud, but most can only react to a very limited number of events in real time. They don’t have what is being called an ‘event-driven architecture’.
But they soon will have, if the IT analyst group Gartner is correct in its predictions. EDA, says Gartner analyst David McCoy, who cited the above example at the company’s recent annual integration and web services event, is “the next big thing”.
By 2008, he and his fellow analysts think event processing will be mainstream, with most new business systems in large companies set up to emit vast amounts of event information. “Applications are going to start to get very chatty,” says McCoy.
And it is not just applications. Leading thinkers such as Nicholas Negroponte of MIT, and Glover Ferguson, the chief technology officer of Accenture, have forecast that billions of RFID chips, remote sensors, and even a whole world of virtual objects and ‘avatars’ will soon start bombarding their monitoring systems with their latest news.
That kind of information is valuable, but only to those who have set up an IT architecture that is flexible and powerful enough to use it. As ever, the arrival of this new acronym comes with an imperative: those that use EDA will gain important financial and strategic benefits, in terms of agility, shortened process time, simplicity and speed of reaction. The rest will risk obsolescence.
The imminent arrival of the EDA has analyst groups and suppliers scrambling to work out the implications. Gartner has made EDA a central theme in its strategic advice and predicts a huge hype wave is about to break: “This will be the subject of conferences in the next few years. This will be the cover theme of magazines,” said McCoy. (It already is).
Meanwhile, suppliers are already working on their systems, and marketing departments on their angles. IBM has developed a set of standards and procedures called ‘Common Event Infrastructure’, has submitted some of those standards to the Oasis standards body, and is promising new EDA products later this year (see box ‘EDA: A technology primer’). Tibco, whose publish/subscribe integration systems are partially event-driven already, is expected to deliver an event management product soon; Oracle has introduced an ‘event infrastructure’ to handle large numbers of RFID messages. Dozens of start-ups are working on specialist products, and systems management companies such as BMC and Computer Associates are busy working out how their event-handling systems management products can be extended to the application level, especially in areas such as business service management.
In fact, there is so much event-driven technology out there already, both in the labs and working at customers’ premises (see box), that EDA seems scarcely to justify the soubriquet of “the next big thing”. Moreover, “the technical prerequisites are already there in most large enterprises. Any company of $250 million [per annum revenue] and up has an IT infrastructure that can support EDA,” says Roy Schulte, the Gartner analyst who is credited with developing much of the thinking on EDA.
But some technology changes will be required – as will, for most organisations, a lot of new architectural and strategic business thinking.
“Today, all enterprises are event-driven to some extent already, but mostly implicitly and without real thought,” says Schulte. An event-driven enterprise, he says, “has a deliberate strategy of employing [systems] design concepts that allow it to respond to as many different events as possible.”
Businesses should rebuild their systems based on a consciously event-driven architecture, rather than use ad hoc, tactical systems, he says.
On demand
If the technical leap to EDA is not huge, then neither are the practical differences between an event-driven enterprise and a traditional business necessarily very dramatic. But the impact can be huge.
To illustrate the point, Schulte cites the difference between a manufacturing company that builds to inventory, and one that builds to order. The latter is like an event-driven company.
“Companies like Dell can manufacture to order within a 24-hour cycle. They save a lot of money by not having lots of products in their inventory,” says Schulte.
Schulte believes that similar event-driven processes are beginning to emerge right across the commercial spectrum. “We didn’t just think of this [EDA] ourselves. It is based on what some of our leading edge clients tell us they are doing,” says Schulte.
Some of these applications – such as programmed trading, which began on Wall Street in the early 1990s – are well known. But the key change that Gartner and others advocate is for event handling to be open and standards-based, so that different systems can send event information to each other as standardised messages.
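What a ‘standardised message’ might look like is easy to sketch. The envelope below is a minimal illustration in Python, not any published standard – real efforts, such as IBM’s Common Event Infrastructure submission to Oasis, define far richer schemas – but it shows the idea: a self-describing event that any subscribing system can parse without knowing anything about the emitter.

```python
import json
from datetime import datetime, timezone

# A hypothetical, self-describing event envelope. Every field name here
# is illustrative; the point is only that emitter and subscribers agree
# on a common, standardised shape.
event = {
    "type": "purchase.authorised",
    "source": "card-authorisation-service",
    "id": "evt-000123",
    "time": datetime.now(timezone.utc).isoformat(),
    "data": {"card": "4111-...-1111", "amount": 59.99, "city": "Orlando"},
}

message = json.dumps(event)      # what actually travels between systems
received = json.loads(message)   # any subscriber can parse it back
print(received["type"], received["data"]["city"])
```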
But what are the architectural changes that need to be made? The consensus among software suppliers is that these will not need to be dramatic, although some suppliers will have to make major changes to their systems.
A key point emphasised by those involved in this area – especially in the all-important area of middleware – is that the EDA is the sibling of the SOA (service-oriented architecture). The idea of the SOA is to deliver software functions as loosely coupled services that can be plugged, unplugged or combined to form new applications. The idea of the EDA is that organisations can respond instantly to any relevant event.
“The best use of SOA comes with use of, and understanding of, EDA. These two platforms together form the basis of the real-time, event-driven enterprise,” says Gartner analyst Yefim Natis.
In fact, the infrastructure needed to support EDA includes the use of web services as a key set of communications and interface standards, along with the orchestration and management capabilities of integrating middleware products. Integration suites and business process management software tools are also likely to play a key role in handling the plethora of intra-process messages that event-driven working generates.
That is not to say it is all in place. Although, as Ross Nathan, the CTO of SeeBeyond, points out, “Standard middleware products have been doing EDA since it started,” improvements are still likely to be needed in event handling, complex event management, workflow and state handling, and in the important area of rules engines (which program systems to handle complex events automatically).
There is also some work to be done on standards. SOA architectures today, built on web services, work through synchronous conversations, in which a service is requested and a reply is returned. EDA throws out large numbers of asynchronous messages that demand no reply, and the web services standards to deal with this have yet to be fully specified.
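The difference between the two conversation styles can be seen in a small sketch. The names below are invented for illustration: the synchronous caller blocks until it has an answer, while the event-driven emitter publishes a one-way message and carries on, leaving consumers to pick it up whenever they are ready.

```python
import queue
import threading

# Synchronous, SOA-style conversation: request a service, wait for the reply.
def check_credit(card: str) -> bool:
    return True                            # stand-in for a real service call

approved = check_credit("4111-...-1111")   # caller blocks until it has an answer

# Asynchronous, EDA-style messaging: publish an event and move on.
events = queue.Queue()

def consumer():
    while True:
        event = events.get()               # whoever cares picks events up later
        if event is None:                  # sentinel to end the demo
            break
        print("handling", event["type"])

worker = threading.Thread(target=consumer)
worker.start()

events.put({"type": "purchase.authorised", "card": "4111-...-1111"})
events.put(None)                           # shut the demo consumer down
worker.join()
```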
The design and configuration of EDA systems will also need to be different, because polling systems for information – as most middleware products now do – creates too much traffic. “To keep the traffic down, the application will have to tell us something has happened,” said Nathan. Either way, a robust and capacious network will be needed. The only truly exotic element in the EDA prescription is the use of complex event processing (CEP) – the pattern recognition and resolution technology that enables a wide variety of different events to be interpreted and translated into an equally diverse set of responses.
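To return to the opening example: a complex event processor correlates individually innocent events into a meaningful composite one. The sketch below is illustrative only – the field names, the crude distance formula and the speed threshold are all assumptions, not any card network’s or vendor’s actual logic – but it shows the pattern: two valid-looking purchases become one ‘possible fraud’ event.

```python
from datetime import datetime

MAX_SPEED_KMH = 1000   # faster than any airliner, so physically impossible

def distance_km(a, b):
    # Crude flat-grid approximation; good enough for a demonstration.
    return ((a["lat"] - b["lat"]) ** 2 + (a["lon"] - b["lon"]) ** 2) ** 0.5 * 111

def correlate(first, second):
    """Derive a 'possible fraud' event from two purchase events."""
    hours = (second["time"] - first["time"]).total_seconds() / 3600
    if first["card"] == second["card"] and hours > 0:
        if distance_km(first, second) / hours > MAX_SPEED_KMH:
            return {"type": "possible_fraud", "card": first["card"],
                    "seen_in": [first["city"], second["city"]]}
    return None

orlando = {"card": "4111-...-1111", "city": "Orlando", "lat": 28.5,
           "lon": -81.4, "time": datetime(2004, 6, 1, 12, 0)}
taipei = {"card": "4111-...-1111", "city": "Taipei", "lat": 25.0,
          "lon": 121.5, "time": datetime(2004, 6, 1, 12, 5)}

alert = correlate(orlando, taipei)
if alert:
    print("alert the authorisation system:", alert)
```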
With the exception of ‘business activity monitoring’ and some specialist applications, Gartner is advising most clients to be wary of CEP at present, owing to product and infrastructural immaturity.
CEP’s roots lie in the defence industry, where it was developed to provide missile guidance systems with the ability to distinguish between friend and foe in real time. Since then it has seeped into commercial use at the heart of real-time financial trading systems and into some network management products.
More recently though, CEP technology has begun to be productised by specialist developers who are applying it to a variety of commercial applications, and surrounding it with tools that make it much more accessible to business users.
In the UK, for instance, Apama’s Event Modeller provides financial traders with a dashboard that allows them to set and reset the rules their electronic systems use to interpret and respond to real-time market feeds. In the US, Metatomix’s SMARTE system integrates federal and local government intelligence systems, providing a joined-up, real-time view of essential security and surveillance information.
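Apama’s Event Modeller itself is a graphical tool with its own modelling language, so the Python below is only a loose analogy of the trader’s-dashboard idea: a rule whose parameters can be reset at run time, applied to a stream of price events. All names and thresholds are invented.

```python
# Illustrative only: a resettable rule over a price feed, loosely analogous
# to what a trader might configure through a dashboard such as Apama's.
class PriceRule:
    def __init__(self, symbol, buy_below, sell_above):
        self.symbol = symbol
        self.buy_below = buy_below
        self.sell_above = sell_above

    def on_tick(self, tick):
        """Interpret one market event and return a response, if any."""
        if tick["symbol"] != self.symbol:
            return None
        if tick["price"] < self.buy_below:
            return ("BUY", tick)
        if tick["price"] > self.sell_above:
            return ("SELL", tick)
        return None

rule = PriceRule("IBM", buy_below=85.0, sell_above=95.0)
for tick in [{"symbol": "IBM", "price": 84.2}, {"symbol": "IBM", "price": 96.1}]:
    action = rule.on_tick(tick)
    if action:
        print(action[0], "at", tick["price"])

rule.buy_below = 80.0   # the trader 'resets the rule' without redeploying
```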
It is the adoption of EDA and CEP by the bigger suppliers, however, that will really establish the technology. Two suppliers stand out – Tibco, with the evolution of its publish/subscribe systems, and IBM.
Jason Weisser, IBM’s corporate VP of enterprise integration, says his company’s on-demand operating environment, akin to Gartner’s concept of EDA, “is predicated on the use of SOA as the adaptation layer” between business processes and a set of loosely coupled systems services, linked to a complex event processing engine that controls the distribution of these services between the competing processes.
IBM’s CEP engine, presently code-named Whitewater, is set to appear late in 2004. It will provide the last of three capabilities that IBM believes are essential to the realisation of true on-demand or event-driven working.
The first of these, the ability of systems to sense and respond to different stimuli, is already ‘baked into’ IBM’s autonomic computing systems management vision, says Weisser. The second, an adaptive capability that allows systems to respond to changing events based on past experience, is also already embedded within IBM’s early on-demand offerings. The third capability, that Whitewater will empower, will introduce a proactive element to systems behaviour by diluting what Weisser calls the “specificity” that most conventional systems require to respond to events.
At this point, says Weisser, the on-demand operating environment moves into the realms of the business process layer, providing business users with a way of applying business rules to systems in a much less restrictive way.
When will all this happen?
Gartner’s view is that EDA will become part of mainstream systems planning within four years, and will be commonly used in new applications in that timeframe. But, as with SOA and web services, there are huge migratory, architectural and business strategy issues that must be tackled – not all at once, but step by step. “It will take at least 20 years for the notion of EDA to come close to the potential of what can be achieved,” concludes Schulte.