One reason why encouraging adoption of business intelligence systems can be so difficult is that many users believe they already have perfectly adequate systems. These are often based on Microsoft’s desktop tools such as Access databases and Excel spreadsheets.
The second of the panel debates at Business Intelligence 2008 acknowledged that the popularity of these tools often jeopardised centralised BI projects.
Jason Francis-Sutton, head of business intelligence at mortgage services provider Homeserve, said that the use of these systems is a serious threat to data quality. “It is very dangerous. But it is born out of frustration at not being able to get the right information at the right time,” he said, adding that only a fully mature data warehouse can match the immediacy of local spreadsheets.
Anthony Howcroft, European managing director of data warehouse appliance vendor DATAllegro, highlighted the scale of the problem. “One of our customers was working to replace more than 6,000 spreadsheets [with its BI system],” he said.
But there is good reason for their popularity, he added: “People use Excel and Access because they are good tools.”
Necessary evil
Chris Lees of property industry IT consultancy Calvis echoed this thought. “There are risks associated with spreadsheets, but they are an absolutely necessary evil,” he said.
“If you look at how companies in the real estate industry reacted to the sub-prime crisis, spreadsheets gave them far more flexibility than any tool delivered by the IT department could.”
DATAllegro’s Howcroft warned that businesses try to impose BI tools from on high at their peril. “Organisations have to be more embracing of the BI tools that end-users are already familiar and comfortable with,” he said.
But doing that while driving up data quality is, the panel agreed, a challenge with no simple solution.
Some look to technology providers to help them improve data quality. Oliver Berchtold, a data management and reporting expert at global accountancy firm Ernst & Young, said that effective data quality requires the ability to record business context alongside the data itself. “We are looking for a data warehouse with the capability of recording business metadata,” he said.
Francis-Sutton echoed the sentiment that organisations need to tie down the business context of data. “At one point, we had four separate definitions of a full-time employee,” he said.
But for him the root of the data quality problem lay in human resources, not technology. “Our biggest problem is the lack of skill internally surrounding data governance,” he said.
Calvis’s Lees reported that, to ensure data definitions reflect the requirements of the business, some organisations employed collaborative tools, such as wikis, to let people across the business define them.
“Once you’ve set up standard data definitions,” he added, “you need to make them available at the time of need.” To ensure that deviations from standard definitions are not introduced, it must be easy for workers to use the definitions properly – they must therefore be integrated into the tools people use.
This point underlines the challenge that organisations face in balancing end-user tools with centralised BI initiatives.