The pace of change in business has never been greater. Product lifecycles are increasingly fleeting and competitive advantage short-lived; inventory turnover needs to be fluid, supply chains ‘frictionless’.
Technology, of course, fuels much of that acceleration; yet for all its power to speed up business cycles, there are areas where IT’s ability to deal with true velocity remains in serious doubt.
Traditional business intelligence systems do not analyse current business performance. Depending on how often data is extracted from operational applications, the analysis can paint a picture that is hours or even days out of date. Aware of that shortcoming, business intelligence and applications vendors liberally sprinkle their marketing materials with the phrase ‘real time’; but in many cases, ‘real time’ is actually something that happened yesterday.
The fact is that the kind of real-time facility senior managers actually want – and need – is one that gives them a true understanding of what is happening within their business operations as it happens. And arguably, such real-time analytics systems are a requirement: how else can a business provide an accurate picture of where it stands at any given point, both for its internal decision-making and to satisfy its regulatory obligations?
Accuracy versus extraction
Notwithstanding such considerations, the difficulties of getting a real-time – or even a near real-time – view of data should not be overlooked.
Today, business applications involve the creation and processing of huge amounts of data. A set of ERP applications may hold data in a single store, but giving decision-makers direct access to that data has historically been prohibited – primarily because queries against live operational databases greatly affect their performance, making them damagingly slow. So the cardinal rule has been to keep operational and analytical data sources separate.
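To make that rule concrete, the sketch below shows the batch-extraction pattern in miniature: a scheduled job copies new rows from the live store into a separate reporting database, so analytical queries never touch the operational system. It is a minimal illustration only; the database paths, the ‘orders’ table and its columns are all hypothetical.

```python
# A minimal sketch of the traditional batch-extraction pattern. The file
# paths, the 'orders' table and its columns are hypothetical.
import sqlite3
from datetime import datetime, timezone

def extract_orders(operational_path: str, analytical_path: str) -> int:
    """Copy new order rows from the live system into the reporting store."""
    src = sqlite3.connect(operational_path)   # live ERP database
    dst = sqlite3.connect(analytical_path)    # separate reporting store
    dst.execute(
        "CREATE TABLE IF NOT EXISTS orders_snapshot "
        "(order_id INTEGER PRIMARY KEY, amount REAL, extracted_at TEXT)"
    )

    # Pull only rows not yet copied, keeping the load on the operational
    # database short and predictable.
    last = dst.execute(
        "SELECT COALESCE(MAX(order_id), 0) FROM orders_snapshot"
    ).fetchone()[0]
    rows = src.execute(
        "SELECT order_id, amount FROM orders WHERE order_id > ?", (last,)
    ).fetchall()

    stamp = datetime.now(timezone.utc).isoformat()
    dst.executemany(
        "INSERT INTO orders_snapshot VALUES (?, ?, ?)",
        [(order_id, amount, stamp) for order_id, amount in rows],
    )
    dst.commit()
    src.close()
    dst.close()
    return len(rows)

# Scheduled hourly or nightly, this refresh means any report built on
# orders_snapshot is accurate only as of the last run - exactly the
# latency described above.
```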
Regular extractions of data from operational sources are, of course, highly valuable – indeed, they are the backbone of business intelligence – but in certain circumstances their accuracy is found wanting, something that is particularly notable when dealing with financial numbers. The picture is further complicated by the fact that few organisations have a single source of operational data. Typically, analysis is carried out on databases assembled from several often-incompatible, often-overlapping and sometimes-contradictory sources.
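That accuracy problem is easy to see in miniature. In the hypothetical sketch below, two extracts – one from an ERP system, one from a billing system – disagree about the same invoice, and the consolidation step can only pick a value arbitrarily or flag the conflict for manual adjustment. Every name and figure is invented for illustration.

```python
# A hypothetical illustration of the consolidation problem: two extracts
# report different amounts for the same invoice. All names and figures
# are invented.
def consolidate(extracts: list[dict[str, float]]) -> tuple[dict[str, float], list[str]]:
    """Merge several extracts; return merged totals plus conflicting IDs."""
    merged: dict[str, float] = {}
    conflicts: list[str] = []
    for extract in extracts:
        for invoice_id, amount in extract.items():
            if invoice_id in merged and merged[invoice_id] != amount:
                conflicts.append(invoice_id)  # contradictory sources
            merged[invoice_id] = amount       # the last extract wins, arbitrarily
    return merged, conflicts

erp_extract = {"INV-001": 1200.0, "INV-002": 450.0}
billing_extract = {"INV-002": 455.0, "INV-003": 980.0}  # INV-002 disagrees

totals, disputed = consolidate([erp_extract, billing_extract])
print(disputed)  # ['INV-002'] - the kind of discrepancy that surfaces as
                 # a last-minute adjustment at financial close
```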
For companies such as Danish medical equipment maker Coloplast, these problems of inaccuracy and delay have meant that, despite extensive investments in enterprise resource planning (ERP) applications and business intelligence tools, the company still had “trouble reporting out of the ERP systems”, says Jesper Kalenberg, director of business control at Coloplast. In particular, line-of-business managers needed the ability to track financial performance closely, explains Kalenberg. “During financial closing periods we were always making last minute adjustments, but with the tools we had it took up to a day, and that wasn’t good enough.”
Coloplast’s solution was to introduce reporting tools – in this case from the GL Company – that allowed its managers to access the operational data in its ERP system directly, without impacting performance. Previously, such reports could only be generated from a staged data store; reports that used to take between six and eight hours are now done in a matter of minutes, says Kalenberg.
Select areas
While such examples provide some compelling arguments for real-time intelligence, there remains very little agreement about what constitutes ‘real time’. As cynics point out, for most suppliers, real time has come to mean ‘as fast as their technology permits’.
At music company EMI, the traditional approach to business intelligence involved not only populating a data warehouse with sales data, but also allowing staff to manipulate the data with Microsoft Excel to create periodic reports. The Excel process alone took two days.
But against a backdrop of turbulence in the music industry, EMI’s senior managers demanded a faster analytical process. In response, the company developed Pulse, a platform that integrates business intelligence, analytics and reporting, document management and a portal using Microsoft’s SQL Server, Analysis Services and SharePoint Server. This provides a near real-time view of global campaigns, so that managers can track sales, broken down by geography, retailer or artist, and thus measure the effectiveness of any promotion.
Again, in this context, real-time analysis does not translate to ‘instantaneous’ but rather ‘very quick’.
At UK electrical retailer Comet, the use of real-time analysis varies across the company, depending on business requirements. It has used business intelligence to improve its home delivery and repair services by optimising the scheduling and routing of delivery vehicles and repair teams across its 250 outlets.
The company also uses BI tools for its stock management and purchasing strategies, applying a combination of custom-built tools and software from MicroStrategy and Conchango to optimise the allocation of sales space in individual stores. Since the implementation, Comet has increased its profits by 20%. However, real-time BI need not be applied universally: whereas the delivery and repair teams rely on up-to-the-minute information, Comet’s store managers do not need the capability to revise their analyses as individual items fly off the shelves.
As that suggests, many business leaders may find themselves seduced by the concept of real-time BI, but in practice they differentiate between areas of the business that need real time and those that can be served by analysis based on an hourly, daily or even weekly data refresh. Such views contradict some of the marketing emanating from the vendor community, where technologies such as corporate performance management (CPM) have been touted as a means of providing business leaders with deep and, yes, real-time insight into current operations through a dynamically updated dashboard of ‘live’ dials showing the status of key performance indicators. This can be misleading: much of the analysis is not being done in real time and, importantly, in many cases it need not be. As Nick Gomersall, senior vice president for sales at the GL Company, says: “Tools such as CPM dashboards can be incredibly useful for some things. But no-one is actually going to run their business from a dashboard.”
The move towards a real-time infrastructure can also be expensive; but in some cases the difference between, say, a trading statement based on extracted and then manually adjusted data and one that is drawn directly from the operational data can be the difference between a compliant declaration and one that is legally questionable.
For example, managers at multinational brewing giant the Fosters Group wanted to improve the speed and accuracy of their financial analysis and so smooth the process of closing the annual accounts. The company uses a financial package from software giant Oracle, but using traditional online analytical processing (OLAP) tools to generate reports would take hours. Having introduced GL’s Accounting Intelligence tools, which link directly into the database without the need to stage the data, Fosters can now produce trustworthy, online reports in minutes, dramatically improving its ability to roll up its period-end numbers.
As that suggests, there is pressure on technology companies to make business intelligence much more real-time, so that businesses can see exactly where they stand on any operational aspect and react to changes swiftly and confidently. Accounting is one area where demand has surfaced; manufacturing and Internet sales are others. But companies are also beginning to recognise that while yesterday’s data will sometimes do, in many cases the ‘truth’ is only available in real time.