If the 1990s and 2000s were the era of business applications, then the 2010s are shaping up to be the age of analytics. At least, this is what a number of vendors would have us believe.
In 2010, IBM continued to push its ‘Smart Planet’ vision, which portrays the future of business management – and society at large – as data intensive and highly analytical.
“The knowledge of the world, the flow of markets, the pulse of societies… can be turned into intelligence, because we now have the processing power and advanced analytics to make sense of it all,” said IBM CEO Sam Palmisano at the start of the year, in a speech heralding the ‘decade of smart’. “With this knowledge we can reduce cost and waste, improve efficiency and productivity, and raise the quality of everything from our products to our companies to our cities.”
Of course, the computing giant hopes that in realising this vision, businesses and governments will buy components of its ever-widening array of analytical technologies. In 2010, it added to this roster with the acquisitions of web analytics provider Coremetrics and data warehousing appliance vendor Netezza.
IBM is not the only IT giant that sees opportunity in analytics. Systems giant Oracle is pitching its Exadata appliances – built on technology it acquired along with Sun Microsystems at the start of the year – as a direct competitor to IBM’s analytics systems, while storage vendor EMC’s acquisition of data warehousing appliance vendor Greenplum in July 2010 brought that company into the fray.
Applications vendor SAP, meanwhile, announced the forthcoming release of HANA, an analytics appliance built in part on technology it gained through its May acquisition of Sybase.
Appetite for data warehousing systems – which are typically used to compile all the data a business needs to analyse – is certainly growing. According to analyst company The 451 Group, the data warehousing market will grow by 11.5% each year to reach a total value of $13.2 billion in 2015.
But what about the business intelligence tools that organisations use to interrogate and analyse the contents of those data warehouses?
The Effective IT Survey found that only 6.2% of respondents adopted a corporate-wide BI platform in the past year. And according to market analyst IDC’s estimates, the market for BI tools is growing at just over half the rate of the data warehouse market (as estimated by The 451 Group). The combined market for BI tools will grow 6.9% each year to reach $11.3 billion in 2014, IDC predicts.
A little more insight into BI adoption trends can be gleaned from the recent success of open source BI software vendors. Jaspersoft, for example, was named the fastest-growing vendor in the BI space by analyst company Gartner in April 2010, while Pentaho predicted 150% growth for the year. Both companies are much smaller than established vendors such as SAS, SAP and IBM, but BI growth for those vendors was sluggish at best.
There are a number of reasons for the success of open source BI suppliers, including the growing acceptability of open source software in the enterprise and budgetary constraints on the IT department.
But the fact that open source providers can make inroads into the market also suggests a degree of commoditisation in conventional BI technology. When there is little to separate the available options, businesses may well plump for a ‘good enough’, cheap alternative (although neither Jaspersoft nor Pentaho would describe their software as such).
On the face of it, this view of the BI tools space as a commoditising market seems at odds with IBM’s analytics-driven vision of the future. In fact, all it really means is that the traditional model of business intelligence, in which a data warehouse supports standard reports or dashboards, is well enough established to come under price pressure from low-cost suppliers.
Beyond that traditional model, however, there are all manner of innovative analytical technologies filtering through to the mainstream that promise businesses the ability to get more insight from their data.
Alternative analytics
Some of those innovations relate to the database that supports analytics. In-memory databases, while not themselves a new development, are becoming cheaper thanks to falling component prices. BI suppliers such as QlikTech and TIBCO Spotfire, whose tools are built on in-memory databases, argue that the speed of in-memory systems allows business users to explore company data without first having to ask the IT department for a specific report. Now conventional BI vendors are catching on: SAP’s HANA, for example, is based on an in-memory system.
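To illustrate the general principle – not any particular vendor’s product – the sketch below keeps a small, hypothetical sales table entirely in RAM and answers an ad-hoc question against it. The table and column names are invented for the example.

```python
# Minimal sketch of ad-hoc, in-memory analysis. The table and column
# names are hypothetical; this shows the general idea, not a vendor's product.
import sqlite3

# An in-memory database: the working set lives entirely in RAM,
# so exploratory queries come back quickly without a pre-built report.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("EMEA", "widgets", 120.0), ("EMEA", "gadgets", 75.5),
     ("APAC", "widgets", 98.0), ("AMER", "gadgets", 143.2)],
)

# A business user can pose a new question on the spot,
# rather than waiting for IT to build a specific report.
for region, total in conn.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY 2 DESC"
):
    print(region, total)
```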
Another field of database-level innovation is massively parallel processing. Again, MPP – in which data processing jobs are broken down into chunks and conducted in parallel – is a technology that has been around for some time. Teradata, the company that arguably originated the data warehouse appliance market, has always used MPP in its systems.
The technology has seen a recent uptick in popularity, especially among web companies, as it allows them to analyse the behaviour of individual users, rather than having to extrapolate trends from a sample.
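The MPP principle can be shown in miniature: split the data into partitions, aggregate each partition in parallel, then merge the partial results, much as an MPP warehouse spreads a query across its nodes. The toy clickstream data and function names below are made up for illustration.

```python
# Toy illustration of massively parallel processing: each worker aggregates
# its own partition of the data, and the partial results are merged at the end.
# The data and function names are hypothetical.
from collections import Counter
from multiprocessing import Pool

def count_events(partition):
    """Aggregate one partition: count page views per user."""
    counts = Counter()
    for user_id, _event in partition:
        counts[user_id] += 1
    return counts

if __name__ == "__main__":
    # Clickstream-style records of individual users, not a sample: (user_id, event)
    events = [(u, "view") for u in ("alice", "bob", "alice", "carol", "bob", "alice")]

    # Partition the data, as an MPP system would distribute it across nodes.
    partitions = [events[i::4] for i in range(4)]

    # Each partition is aggregated in parallel by a separate worker process.
    with Pool(processes=4) as pool:
        partials = pool.map(count_events, partitions)

    # Merge the partial aggregates into the final per-user result.
    total = sum(partials, Counter())
    print(total)
```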
Changes are also under way at the user-interface end of BI systems. A popular fad for infographics and data visualisation coincided in 2010 with the rise of companies such as Tableau Software, a BI vendor whose data visualisation software was designed by the man behind the rendering engine of 3D animation studio Pixar. Tableau grew sales by 123% year-on-year in the third financial quarter of the year.
Meanwhile, the continued rise of social media has provided businesses with a whole new data set through which to analyse customer behaviour.
The armoury of analytical tools available to businesses now stretches far beyond the tried and tested BI infrastructure. However, it is by no means a foregone conclusion that businesses have the skills, the culture or even the inclination to exploit them.