Struan Eves, a strategy consultant for Royal & SunAlliance, knows a lot about those challenges. Until recently, the insurance giant had “every BI tool going, from front-end analysis tools and database tools to extraction, transformation and loading tools.”
But over the past two years, R&SA has tried to bring more order to its data. In conjunction with IT services partner Accenture, it has rolled out a project to build a single data warehouse in which data from all across the company is aggregated. Moreover, it has standardised on a single suite of tools for analysis.
The main project driver was simple to state but harder to achieve: a significant improvement in operational efficiency, explains Eves. By building a single enterprise-wide data warehouse, R&SA expects to gain a tighter rein on operational expenditure while honing its risk analysis processes.
Non-trivial operation
Building a data warehouse capable of holding R&SA’s customer records is no trivial task. The warehouse will hold over 10 terabytes of data when the project is complete in 2007. It will contain all the data relating to R&SA’s claims and policy administration for both business and consumer markets in the UK.
Establishing such a massive warehouse also presents issues over how different data should be classified. “There’s a small proportion of our data where real-time analysis is important, and we’ve looked at how we can address that,” says Eves. “Mostly, the rest of the data isn’t refreshed daily, but we need to be able to manage both types.”
Eves’ team at R&SA has taken a staged approach to the project rollout, delivering parts for the financial, commercial and consumer functions at different times. As each part of the warehouse has been built up, the team has also had to impose exacting data quality standards on users. “We have strict benchmarks to ensure that the quality stays within certain tolerances,” he says.
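For illustration, the sketch below shows what a tolerance check of that kind might look like in practice. It is a minimal, hypothetical example in Python: the field names, metrics (null rate and duplicate rate) and thresholds are assumptions for the sake of the example, not R&SA's actual benchmarks, which the company has not disclosed.

```python
# Hypothetical sketch of a data quality tolerance check on a load batch.
# Field names and thresholds are illustrative only.

from dataclasses import dataclass


@dataclass
class QualityReport:
    null_rate: float
    duplicate_rate: float


def check_batch(records: list[dict], key_field: str,
                max_null_rate: float = 0.01,
                max_duplicate_rate: float = 0.005) -> QualityReport:
    """Compute basic quality metrics for a batch and reject it if they
    fall outside the configured tolerances."""
    total = len(records)
    if total == 0:
        raise ValueError("empty batch")

    # Count records missing the key field.
    nulls = sum(1 for r in records if r.get(key_field) in (None, ""))

    # Count records whose key has already been seen in this batch.
    seen: set = set()
    dupes = 0
    for r in records:
        key = r.get(key_field)
        if key in seen:
            dupes += 1
        seen.add(key)

    report = QualityReport(null_rate=nulls / total,
                           duplicate_rate=dupes / total)
    if report.null_rate > max_null_rate:
        raise ValueError(f"null rate {report.null_rate:.2%} exceeds tolerance")
    if report.duplicate_rate > max_duplicate_rate:
        raise ValueError(f"duplicate rate {report.duplicate_rate:.2%} exceeds tolerance")
    return report


if __name__ == "__main__":
    batch = [{"policy_id": "P001"}, {"policy_id": "P002"}, {"policy_id": "P002"}]
    print(check_batch(batch, "policy_id", max_duplicate_rate=0.5))
```

In a real warehouse load, checks like this would typically run at the staging layer, so that a batch breaching its tolerances is quarantined rather than merged into the warehouse.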