Sensory perception – Complex event processing comes of age

Many software vendors, especially in the business intelligence space, tout their products as ‘real-time’ systems.

On closer questioning, however, many will concede that what these products truly offer is not ‘real-time’ in its truest sense, but ‘near real-time’. Some of today’s business intelligence (BI) tools, for example, can give a company the ability to analyse data and to extract meaningful business information faster than ever before – but only once that data has been consolidated in a database and then interrogated using querying, reporting or other analytic tools.

It’s a question of latency. Traditional BI processes, where data must be collected and processed, introduce a time delay that doesn’t matter much when it comes to end-of-month, end-of-week or even end-of-day reports. And these days, it’s fair to say that modern BI tools can, when engineered correctly, do a pretty good job of narrowing that latency window still further.

But it is also fair to say that those tools are simply not engineered to handle continuous, streaming data such as live trading information from financial markets. In order to tackle that kind of challenge, companies in the financial services industry, for example, have increasingly turned to another technology – complex event processing (CEP) platforms – that eliminates the latency and restrictions associated with traditional BI.

In a sense, a CEP platform is like an inverse database. Instead of issuing queries over data, a continuous stream of data, or ‘events’, is passed over the query. Or, in the words of Mark Palmer, CEO of StreamBase Systems, a rising star in the CEP space, these systems allow companies to “find the needle before it goes into the haystack”.
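The ‘inverse database’ idea can be sketched in a few lines of code. The sketch below is purely illustrative — it is not the API of StreamBase or any other CEP product — but it shows the core inversion: the query is registered first, as a standing condition, and the data then flows over it, so a match is caught the moment the event arrives.

```python
class StandingQuery:
    """A registered query that events flow over, rather than a
    query issued against data already stored. Illustrative only."""

    def __init__(self, predicate, action):
        self.predicate = predicate  # the condition to watch for
        self.action = action        # what to do when an event matches

    def on_event(self, event):
        if self.predicate(event):
            self.action(event)

# Hypothetical example: flag any trade worth over 1,000,000 on arrival.
matches = []
query = StandingQuery(
    predicate=lambda e: e["type"] == "trade" and e["value"] > 1_000_000,
    action=matches.append,
)

stream = [
    {"type": "trade", "value": 500},
    {"type": "trade", "value": 2_000_000},  # the 'needle'
    {"type": "quote", "value": 3_000_000},  # large, but not a trade
]
for event in stream:  # the data passes over the query, not vice versa
    query.on_event(event)

# matches now holds the single large trade -- found as it arrived,
# before it ever went 'into the haystack' of a database
```

A real CEP engine adds time windows, joins across streams and a declarative query language on top of this basic pattern, but the inversion itself is the essential point.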

Based on the results of this analysis, a CEP platform can then “prompt applications or people to react to information by resetting processing priorities, changing online sales strategies, buying and selling stocks, or some other action,” explains Mike Gualtieri, an analyst with IT market research firm Forrester Research. “Analysing historical databases cannot provide answers to these questions – they require the business to act on urgent events as they happen. CEP platforms connect to live events from wherever they originate, be it a message queue, a click stream or a market data stream.”

Beyond banking

To date, however, CEP has struggled to find a strong foothold outside the financial services niche – but that is changing. Other industries are increasingly interested in capturing real-time events, but in many cases the events in question are not the system-to-system messages generated by trading applications but the alerts issued by sensors and tags attached to physical objects moving about the real world.

That will open up whole new areas of market opportunity for companies offering CEP platforms, which include IBM, Oracle, Progress, Sybase, Tibco and, most recently, Microsoft, according to Teresa Jones, an analyst with IT market research company Gartner. In September 2009 she estimated that market penetration stood at somewhere between 1% and 5%.

“Event processing is underused partly because, until recently, relatively little data on current business events has been available in digital form,” she says. “In the past, many events were either undetected or detected but not reported in a digital format that could be sent over a network or manipulated by computer.”

But the amount of available event data has escalated rapidly in recent years, she adds, and now amounts to an “explosion of event streams flowing over corporate networks”.

Radio frequency identification (RFID) tags and readers, bar codes, temperature and pressure sensors, global positioning system (GPS) devices and accelerometers are beginning to play a significant part in that explosion, and as the cost of these technologies continues to drop, the ability to identify and track physical objects or monitor their current state is likely to become even more appealing, particularly to companies in industries such as manufacturing and retail, where complex supply chains frequently result in mislaid goods and late deliveries.

Jones estimates that in 2009 the total market size for CEP software, including the vertical and horizontal applications that are built on top of CEP platforms, was $190 million. By 2013, she predicts, it will have more than tripled, to $580 million.

World in motion

According to Brian Innes, a technical consultant at IBM’s Innovation Centre at Hursley, near Winchester, the concept of using a combination of CEP and sensors or tags is attracting the interest of many of the centre’s visitors.

Some of them are from pharmaceutical companies that need to track products from the factory to the consumer. Some are from retail companies that are looking for technologies that can ensure that perishable food items are fast-tracked through the supply chain to reach customers in a fresher state. Others are from engineering and manufacturing companies that need a better way to locate vital components as they flow into and out of vast warehouse systems or a more accurate insight into the operation of machinery on the plant floor.

“We’ve already seen a number of pilots – the technology is there today and customers are rolling it out,” explains Innes. “But it often requires a change in mindset to make it work, and that can be a challenge.”

In part, that is because CEP requires closer cooperation between IT and business analysts. Because of its reliance on rapid-fire, accurate querying of data ‘on the fly’, a company needs to know what answers it is looking for from CEP ahead of time, he explains. As events flow over a CEP platform, events can be correlated with others to create new events, which in turn can be passed over other queries. “That puts an onus on the business analytics team to be able to tweak rules dynamically at run-time, in order to take advantage and make sense of dynamic streams of events,” he says.
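The idea of correlating raw events into new, derived events — which can themselves be passed over further queries — can be illustrated with a small sketch. The scenario (temperature sensors on plant-floor machinery) and all names here are hypothetical, chosen to echo the manufacturing examples above; a real platform would express this as a declarative rule that analysts could tweak at run-time.

```python
from collections import defaultdict

def correlate(readings, threshold=30.0, window=3):
    """Emit a derived 'overheat' event when `window` consecutive
    temperature readings from one sensor all exceed `threshold`.
    Illustrative sketch of event correlation, not a product API."""
    recent = defaultdict(list)  # per-sensor buffer of recent readings
    for sensor_id, temp in readings:
        buf = recent[sensor_id]
        buf.append(temp)
        if len(buf) > window:
            buf.pop(0)  # keep only the last `window` readings
        if len(buf) == window and all(t > threshold for t in buf):
            # raw events correlate into a new, higher-level event
            yield {"event": "overheat", "sensor": sensor_id}
            buf.clear()  # reset after raising the derived event

alerts = list(correlate([
    ("s1", 31.0), ("s2", 20.0), ("s1", 32.5),
    ("s2", 21.0), ("s1", 33.0),  # third consecutive hot reading for s1
]))
# alerts -> [{'event': 'overheat', 'sensor': 's1'}]
```

The derived `overheat` events could in turn feed a second standing query — say, one that watches for several machines overheating in the same zone — which is exactly the layering of queries Innes describes.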

“CEP is moving away from writing algorithms and towards defining rules,” agrees Alan Ward, a researcher at BT Research Laboratories. He is working on a series of demonstrations and prototypes of ‘smart environments’, where objects can communicate information about themselves and their current states, which can then be combined with other events on CEP platforms.

Ward sees tremendous potential in cross-industry applications. If road signs and vehicles were equipped with sensors, for example, the resulting data streams might contain ‘truths’ that could be of interest to government transport agencies, insurance companies and logistics operators alike. Likewise, ‘smart meters’ that collect information about electricity consumption, for instance, are an area of great interest to utilities companies, the emergency services and environmental agencies.

Some companies have got further down the road with tactical applications of the technology that have been up and running for some time, according to Giles Nelson, chief technology strategist of Progress Software. The company’s Apama software (which Nelson co-invented) is used in combination with RFID tags by Dutch bookseller BGN to track the movement of individual books. This means that in-store receipt of books can be automatically processed (instead of staff manually checking through boxes of deliveries) and, via in-store kiosks, customers can locate a particular book on a specific shelf in one of the company’s 40 stores.

And Royal Dirkzwager, the Dutch maritime logistics provider, uses Apama to automate the monitoring and analysis of real-time shipping information in order to track the arrival of cargo ships at ports around the world. Using CEP, it is able to dynamically adjust the course and speed of a container ship in order to optimise fuel usage, maximise shipping loads or take account of weather conditions and port berth availability.

“I think CEP’s a rotten term and I constantly find myself apologising for using it,” says Nelson. “But what the term encapsulates is actually pretty powerful. CEP can track and respond to millions of moving objects, like trucks, ships, planes, packages and people, and that’s enabling customers to respond in a dynamic way to both opportunities and threats in supply chains, logistics, power generation and manufacturing. It’s just a better way to analyse events as they occur, in response to unfolding event patterns, rather than hours, days or even months later.”

Putting it into action

Companies that decide to pursue ‘situational awareness’ in, for example, their supply chain face a number of challenges. For a start, monitors that deal with events from two or more disparate systems are inherently more difficult to implement than those that focus on input from a single system or process, as a recent Gartner report points out. Capturing the event data is typically the most difficult aspect of these systems, say its authors, because it may involve modifying applications to emit events, installing software adapters [to ‘translate’ the input] or installing sensors in the physical world.

“Part of the challenge is technical – instrumenting the systems that are the event sources and converting event notifications into a format and protocol that the [CEP-based] monitor can handle,” they say. “Another challenging aspect is organisational – getting cooperation from the companies, business units and people that own and manage the systems, devices or processes that are the event sources.”
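A software adapter of the kind the Gartner authors describe is conceptually simple: it translates a device’s raw output into the event format the monitor expects. The sketch below assumes a hypothetical pipe-delimited line from an RFID reader and a hypothetical JSON event shape — both invented for illustration.

```python
import json

def rfid_adapter(raw_line):
    """Translate one raw RFID reader line (assumed format:
    tag|reader|unix-timestamp) into a JSON event a downstream
    CEP monitor might consume. Field names are hypothetical."""
    tag_id, reader_id, ts = raw_line.strip().split("|")
    return json.dumps({
        "source": "rfid",
        "tag": tag_id,
        "location": reader_id,   # the reader that saw the tag
        "timestamp": int(ts),
    })

event = rfid_adapter("TAG42|DOCK-3|1262304000")
# event is a normalised JSON string ready for the monitor
```

Simple as each adapter is, the Gartner point stands: there may be one of these per device type and per protocol, and each depends on cooperation from whoever owns the source system.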

Even choosing the right system is likely to be a taxing process. Many of the smaller CEP specialists have been bought up by much larger companies (StreamBase Systems being a notable exception) but the market remains “broadly distributed”, according to Jones of Gartner. Products are expensive too, she notes, with list prices currently ranging from $50,000 to $500,000. Although deal sizes appear to be falling “somewhat”, the top end of this price bracket can be hard to justify for many kinds of projects.

One organisation currently evaluating a CEP deployment is Royal Mail, which sees the technology as a way to track individual deliveries, and therefore to advise recipients of exactly when their parcels might arrive. Speaking to Information Age this year, chief architect Stuart Curley said that the technology is only now approaching affordability.

However, Microsoft’s entrance to the CEP market – announced in 2009 – is likely to impose downward pressure on cost, particularly because its CEP offering, StreamInsight, will ship as a feature of the SQL Server 2008 R2 DBMS at no extra charge.

A popular, albeit clumsily named, concept in information technology is the ‘Internet of things’. This concept describes a world where machine-to-machine (M2M) communications enable intelligent devices and objects to ‘talk’ to one another and report on their current location or state. If this vision is to become reality, organisations – and the people who work for them – will need a way to cut through the chatter and listen only to those conversations that require action to be taken. And CEP is starting to provide an accurate way of doing just that.

The future is complex

From machinery in manufacturing plants to mobile phones and heart monitors, today’s devices are ready to start talking. In many cases, they are already chattering away, using M2M communication to report on their status, convey data to other machines and receive instructions remotely.

It’s been a long time coming, but in 2009 the M2M industry finally gathered real momentum with the delivery of numerous robust, mature projects that enable physical objects to connect wirelessly using sensors, transmitters and RFID tags.

By the end of 2009, according to analysts at research company Berg Insight, M2M communications accounted for 1.4% of all mobile network communications worldwide, although the share was higher in territories such as the US (4.3% of all mobile connections) and the European Union (2.4%). By 2014, Berg Insight’s analysts predict, M2M will account for 187.1 million mobile connections, or 3.1% of the worldwide total.

With far more devices coming to market enabled with built-in connectivity, interest is building in complex event processing technologies as a means to filter out some of the ‘noise’ and extract meaningful business insight from the M2M babble.

According to Teresa Jones of Gartner, the primary drivers of CEP technology in the years ahead will be ‘sense and respond’ and ‘situation awareness’ applications that alert organisations to events requiring intervention, whether automated or manual. These applications, she suggests, might include:

• Cargo tracking
• Fraud detection
• IT operations management
• Logistics, transportation and fleet management
• Mobile asset management
• Supply chain management
• Track and trace applications, to report the history of food from ‘farm to fork’ or track pharmaceutical products from factory to consumer
• Vehicle security automation
