There was a time when a business analyst would pore over last month’s figures, trying to glean clues about the company’s performance. But times have changed.
Not only are business leaders demanding up-to-the-second information about company performance, they also expect their analysts to draw on information from all manner of sources as they strive for a competitive edge.
In some cases, the analysts are not just being pushed to provide insight into past performance; they’re expected to predict the future. Below, Information Age looks at how a handful of organisations have deployed advanced analytics technology, and explores some of the benefits it has brought them.
Fashion forecasting at catalogue retailer Otto
When you’re in a business as fickle as fashion, forecasting the future might seem an impossible challenge. But for German catalogue shopping titan Otto (best known in the UK for its Freemans Grattan brand), predicting shoppers’ habits is the key to success.
As Thomas Friese, senior project manager for forecasting at Otto, explains, predicting the volume of sales for a particular item is critical to driving down shipping costs.
“Many of the clothes we sell are manufactured in China, so a substantial portion of their retail cost is shipping,” Friese says. “If we can order enough in advance, we can ship the items across on cargo ships. But if we need more than we’d bargained for, we have to fly them over, which costs much more.”
Beyond shipping costs, predicting demand for a particular garment is fundamental to catalogue fashion retail. Anything that does not sell in a season gets put in a sale, where the aim is to recoup as much as possible from a depreciating asset.
To make a prediction about likely demand, Otto’s forecasting team take huge quantities of data from a plethora of disparate sources. The team extracts more than 100 million records per day, including historical and real-time sales data and supply chain information – all with varying degrees of quality.
This data is extracted from operational systems and loaded onto predictive analytics software from Blue Yonder.
Before performing its analysis, the system first checks whether the data is suitable for forecasting. This can involve sifting through a petabyte of data.
The system then analyses data, including how much a particular item has been advertised, sales patterns from the previous week and which market the item is aimed at, and produces a sales forecast. Using Blue Yonder has improved Otto’s ability to forecast demand, with predictions now up to 40% more accurate, says Friese, driving up profitability at the firm.
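Blue Yonder’s models are proprietary, but the basic shape of the forecast Friese describes (turning signals such as advertising exposure, the previous week’s sales and the target market into an estimate of demand) can be sketched in a few lines. The example below is purely illustrative: the feature names and figures are hypothetical, and scikit-learn stands in for the commercial software.

```python
# Illustrative sketch only: Blue Yonder's platform is proprietary, so this uses
# hypothetical feature names and scikit-learn to show the general shape of a
# demand forecast built from the signals Friese describes.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data: one row per item-week.
# Columns: advertising exposure (e.g. catalogue pages), previous week's sales,
# target-market segment (encoded numerically).
X_train = np.array([
    [3, 120, 1],
    [1,  40, 2],
    [5, 300, 1],
    [2,  80, 3],
])
y_train = np.array([150, 35, 340, 90])   # units sold the following week

model = LinearRegression().fit(X_train, y_train)

# Forecast demand for a new garment with a known advertising plan and market.
new_item = np.array([[4, 200, 1]])
print(f"Forecast units: {model.predict(new_item)[0]:.0f}")
```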
The system is also being used to manage Otto’s returns business. Many of its customers will readily send items back if they’ve changed their minds since buying them. While Otto aims to make this as easy as possible for its customers, it adds a significant cost to the business, as the company pays for the returns to be sent back through the postal system.
“If we can minimise the number of items being returned to us, it has a big impact on our bottom line,” says Friese.
The predictive analytics system from Blue Yonder has been used to identify successful strategies to reduce return rates in its online business. For example, the firm was able to see immediately the impact of getting its sales teams to talk telephone orders through with customers.
Increasingly, Otto is using Blue Yonder to link its traditional catalogue business with its online operations, says Friese. “The online side means that the volume of data we generate is growing rapidly,” he adds. “If we had to rely on traditional BI tools, we’d never cope.”
Andrews Sykes plans best delivery routes for rental equipment
Even meteorological scientists find it hard to predict the weather. For a business as seasonal as Andrews Sykes, a UK provider of portable heating and air conditioning units, the weather forecast provides little in the way of forward clarity.
“We keep one eye on the forecasts, so when the country was facing floods [in late 2012], we were as ready as we could be to get our pumps to customers,” says Roger Pearson, business systems manager at Andrews Sykes. “But the forecasts are too changeable to plan too far into the future.” Where Andrews Sykes can apply a degree of data analysis is in planning its inventory.
The company operates a rental business, so it needs to decide which equipment to lease to which customers tomorrow based on where it is today. “Making sure we send the nearest available item helps us cut down on our fuel bills,” explains Pearson.
This means working out the optimum delivery route for thousands of items at locations up and down the UK every day – and quickly.
Andrews Sykes has built its advanced analytics capabilities on IBM’s Cognos business intelligence platform. The system pulls in inventory data from the company’s applications, and combines it with postcode data and geo-location information from Google Maps.
“What this gives us is the ability to see where all of our assets are, where our clients are, where the equipment has to get to, and the best route there,” explains Pearson. “Also, we can look at historical demand, to see how future demand may pan out.”
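Neither the Cognos reports nor the routing logic are public, but the core idea Pearson describes (geocode the customer, geocode the fleet, and send the nearest available unit) is simple to sketch. The snippet below is a hypothetical illustration using straight-line distances; the asset IDs and coordinates are invented, and in practice postcode lookups and Google Maps data would supply the locations.

```python
# Illustrative sketch, not the Cognos implementation: given geocoded asset
# locations and a customer's coordinates, pick the nearest available unit.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical fleet: (asset_id, latitude, longitude, available?)
assets = [
    ("pump-017", 52.63, -1.13, True),    # Leicester
    ("pump-042", 53.48, -2.24, True),    # Manchester
    ("pump-108", 51.45, -2.59, False),   # Bristol, already on hire
]

customer = (51.50, -0.12)  # central London

available = [a for a in assets if a[3]]
nearest = min(available, key=lambda a: haversine_km(a[1], a[2], *customer))
print(f"Send {nearest[0]}")
```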
A quirk of history means that Andrews Sykes has been able to get the system running in real time – i.e. data is available for analysis as soon as it is entered into the business applications.
Before implementing the Cognos system, the company had used an ERP system from a supplier that had gone out of business. It was able to secure the source code of that ERP software under escrow, and to employ the developers who had worked on its implementation.
That allowed it to build a real-time link between the ERP system and Cognos, instead of relying on overnight batch processing. “Usually, we would not have been able to do the real-time analysis that would give us insight into our asset locations,” admits Pearson.
The ability to interrogate up-to-date information about its assets is helping to drive up utilisation rates of equipment, ensuring that every asset delivers as much value as possible to the business.
Leicester Tigers add analytics to injury prevention
Andy Shelton, a sports scientist working at rugby club Leicester Tigers, is pretty bullish on his team’s chances of success. “When it comes to pitting our first-choice 15 against any other team’s, I’d back ours to win,” he says.
But sporting success is not just about the first-choice players. As the cliché goes, it is about strength in depth. And as Shelton admits, the Tigers squad is simply not as large as those of some rivals, particularly the French sides.
To help level the playing field, the Tigers have been using predictive analytics technology to minimise the number of injuries that their star players pick up. The system, built by specialist technology supplier Edge 10 using the statistical analytics package SPSS, attempts to identify players at risk of injury and adjust their training regimes to reduce the likelihood that they will be unavailable to play.
By far the most common injuries in rugby are soft tissue injuries, which are preventable, says Shelton. Muscles that are approaching fatigue are more susceptible to tearing. If signs of fatigue are picked up early, the coaches can adjust a player’s training programme so that the muscle at risk is rested while the player focuses on other exercises.
Of course, in such a physical game as rugby, it is impossible to prevent all injuries. But if a club can cut the number of soft tissue injuries, there is a far better chance that a team will have its strongest possible squad available – and that should improve its chances of winning.
The IBM predictive analytics tools, based on its SPSS Modeler technology, compare past player stats with those coming back from training sessions, pinpointing any subtle differences in muscle performance that might be indicative of fatigue.
But this biometric analysis is supplemented with other data. Players are regularly asked to complete questionnaires about their activities away from the club: how they slept, for example, and whether they’ve been feeling stressed. This additional personal information is used alongside the monitoring data, helping the club’s analysts to discern other factors that might influence a player’s performance come match day.
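Neither Edge 10 nor the club has published its models, but one plausible building block is a comparison of each player’s latest session metrics with his own historical baseline. The sketch below is a hypothetical illustration: the metric names, readings and threshold are invented, and a real system would draw on far richer data, including the questionnaire responses described above.

```python
# Illustrative sketch: flag metrics that deviate sharply from a player's own
# baseline, one simple way early signs of fatigue might be surfaced.
from statistics import mean, stdev

def fatigue_flags(history, latest, threshold=2.0):
    """Flag metrics deviating from the baseline by more than `threshold` standard deviations."""
    flags = {}
    for metric, values in history.items():
        mu, sigma = mean(values), stdev(values)
        if sigma == 0:
            continue
        z = (latest[metric] - mu) / sigma
        if abs(z) > threshold:
            flags[metric] = round(z, 2)
    return flags

# Hypothetical per-session readings for one player.
history = {
    "sprint_top_speed_kmh": [33.1, 33.4, 32.9, 33.2, 33.0],
    "hamstring_force_n":    [410, 405, 415, 408, 412],
}
latest = {"sprint_top_speed_kmh": 31.2, "hamstring_force_n": 370}

print(fatigue_flags(history, latest))
# A questionnaire score (sleep, stress) could be used to weight the threshold.
```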
Sport is increasingly becoming a numbers business, says Shelton, so the coaches were keen to see what the predictive analytics system could do. “It’s about anything that can give us an edge,” says Shelton.
Ecclesiastical Insurance includes unstructured data in analysis
At Ecclesiastical Insurance, which has a 125-year history of protecting churches and other speciality risks in the charity and education sectors, the importance of data is recognised at the very top level of management.
“We have five top priorities each year, and information management has consistently been in there for the past few years,” explains Steve Blackburn, head of business intelligence at the insurance company.
That led to a data quality drive at the company. “A few of the management team were becoming increasingly frustrated about their ability to make decisions, or to make things happen quickly,” says Blackburn. To address data quality, it built a data warehouse using a combination of Microsoft’s SQL Server database platform and IBM’s Cognos business intelligence tools.
But the firm soon began to realise that much of the data it uses to calculate insurance risks was held outside the organisation. For example, a lot of data about property is held on local council planning websites. But this data is often unstructured and its quality is not always clear.
In order to integrate this data into its calculations while keeping a handle on quality, Ecclesiastical Insurance has implemented a new data management platform from DataFlux.
It has started working on reports from the Charity Commission, which provide postcodes and address details for all the organisations within its remit. The DataFlux system cross-checks this data against the company’s existing records, identifying discrepancies and even evaluating which source is more likely to be correct.
“This is actually helping us to drive up our data quality,” says Blackburn. “It’s never a black-and-white decision about who has the more accurate data.”
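DataFlux’s matching engine is considerably more sophisticated, but the principle of cross-checking an external record against internal ones can be illustrated with a simple fuzzy comparison. Everything in the sketch below is hypothetical: the records are invented and Python’s difflib stands in for the commercial matching logic.

```python
# Illustrative sketch only: cross-check an external (e.g. Charity Commission)
# record against internal records and surface likely matches and discrepancies.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough string similarity between 0 and 1."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

internal_records = [
    {"name": "St Mary's Church, Ambridge", "postcode": "AM1 2CD"},
    {"name": "Ambridge Village Hall Trust", "postcode": "AM1 3EF"},
]

external = {"name": "Saint Marys Church Ambridge", "postcode": "AM1 2CD"}

for record in internal_records:
    name_score = similarity(external["name"], record["name"])
    postcode_match = external["postcode"] == record["postcode"]
    if name_score > 0.8 or postcode_match:
        print(f"Possible match: {record['name']} "
              f"(name similarity {name_score:.2f}, postcode match: {postcode_match})")
```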
The insurer is now looking to integrate geographical information systems to inform its risk assessment of buildings – assessing the flood risk to a building is made a lot easier when you can see where the nearest rivers are – and it is also contemplating the use of in-memory computing, to improve the speed of its analysis.
“The price of memory is coming down, which makes it really interesting for us – although it’s still an expensive thing to get up and running,” Blackburn says.