For any CFO concerned about throwing good money after bad on big data deployments that are simply failing to deliver the information the business needs, there are three essential questions to ask. Why does the business need to collect this data? How is it going to provide value? And, critically, if the way IT has attempted to manage big data to date has not succeeded, isn’t there a better way?
Trying to solve this essentially 21st-century problem with outdated and inappropriate 20th-century technology is not only futile but unnecessary, given the latest generation of database technologies designed to turn data – at any volume – into timely business insight with minimal investment.
And if the CIO is not prepared to face up to the limitations of traditional relational database technology, isn’t it time for the CFO to push for radical change?
As Matthew Napleton, marketing director at Zizo, asks: how will the CIO justify an investment in outdated, outmoded big data technologies over the latest solutions, which can be deployed and delivering insight within months at a fraction of the cost?
Justifying investment
Big data is dominating the CFO’s agenda, with CIOs requesting ever-increasing budgets to keep pace with the technological challenges of managing ballooning volumes of data.
But how does a CFO measure the value of this additional IT spend? In theory, of course, the ability to analyse data rapidly offers huge benefits in improved business understanding and increased agility. But are any of these projects delivering?
And yet most businesses still don’t get the timely answers they need despite this vast investment in data technology.
The situation is set to get worse. While the vast majority of CIOs have yet to wrest control of structured data and deliver the insight required to drive operational improvements, Forrester Research estimates that over 45% of big data deployments are now for marketing.
How can this be justified? How can an improvement in sentiment analysis from social media, for example, deliver bottom-line benefits to a retailer unable to gain the insight required to improve day-to-day logistics and stock control?
Once again, the CFO is being asked to pile even more investment into the big data project. But is this simply a case of throwing good money after bad?
Calling time on big data
And the problem is not new. Organisations have been struggling to come to terms with growing data volumes for years.
Despite the continuing advances in technology – often expensive, leading edge technology – the problem just doesn’t seem to want to go away.
Organisations always seem to have too much data to analyse and not enough computing power to process it in a reasonable timeframe. There are two fundamental problems with the conventional approach.
First, organisations are trying to collect data from too many sources – often without good cause. Second, they are attempting to analyse that data with the same tools that have been used for decades and have barely changed.
The result is an entry-level investment far beyond the budget of all but the largest organisations, and a technology model that is simply not fit for purpose.
So where next for the CFO’s big data budget? Is it time to call a halt to an IT investment that is not and will not deliver value? Or is there a way of harnessing this data to provide business insight?
The relational database management system (RDBMS) has a place and a purpose, but it was never designed for the data volumes it is now being asked to handle.
Despite upgrades, add-ons and a vast array of clever engineers, when it comes to big data, the RDBMS may well have had its day.
There are, however, new database technologies specifically designed for this new era of data management: solutions built not only to manage vast quantities of data but also to compress that data into manageable – and affordable – volumes and provide the business with the insight it requires.
Working in parallel with today’s operational systems and existing data warehouses, the latest generation of database technology requires just a server, not an entire data centre, to run.
Exploiting innovations in areas such as data compression and pattern matching, these solutions not only require minimal infrastructure – and hence minimal cost – but also deliver a new way of locating information within the mass of data, enabling rapid answers to critical business questions.
The result is timely business information, delivered at a speed – and a fraction of the price – that legacy technology cannot match. This approach turns the whole, misguided concept of big data on its head: suddenly data is not massively unwieldy and unmanageable but hugely valuable.
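To make the compression-and-pattern-matching claim a little more concrete, the sketch below shows – in simplified, vendor-neutral Python, with invented names and toy data – how dictionary-encoding a repetitive column lets a query match against a small dictionary and scan compact integer codes instead of raw records. It illustrates the general technique only; it does not describe any particular product’s engine.

```python
# A minimal, vendor-neutral sketch (assumed names, toy scale): a columnar
# store that dictionary-encodes a repetitive text column so a filter plus
# aggregate runs over compact integer codes rather than raw rows. Real
# engines layer run-length encoding, bit-packing and parallel scans on top
# of the same idea.

class ToyColumnStore:
    def __init__(self):
        self.dictionary = {}   # value -> small integer code (stored once)
        self.codes = []        # encoded column: one int per row
        self.measure = []      # numeric column stored alongside (e.g. sales)

    def append(self, value: str, amount: float) -> None:
        # Store each distinct string once; rows keep only its integer code.
        code = self.dictionary.setdefault(value, len(self.dictionary))
        self.codes.append(code)
        self.measure.append(amount)

    def total_where(self, value: str) -> float:
        # "Pattern matching" happens against the small dictionary, not the
        # raw data: one lookup, then a scan over compact integer codes.
        code = self.dictionary.get(value)
        if code is None:
            return 0.0
        return sum(m for c, m in zip(self.codes, self.measure) if c == code)


# Usage: repeated store names compress to a handful of codes, and
# "total sales for London" never touches the original strings again.
store = ToyColumnStore()
for name, amount in [("London", 120.0), ("Leeds", 80.0),
                     ("London", 95.5), ("Leeds", 40.0)]:
    store.append(name, amount)
print(store.total_where("London"))   # 215.5
```

The design point is that the heavy lifting moves from hardware to representation: because the data is smaller and queries touch far fewer bytes, a single server can answer questions that would otherwise demand a cluster.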
There is nothing wrong with the concept of turning data into valuable information. The problem has been the way the IT industry has attempted to achieve this goal.
At best, the big data explosion has become a distraction from the essential business need of using information to drive operational improvements.
At worst, CIOs are being coerced into a massive investment in big data technologies to capture unstructured social data – despite the clear lack of success in managing structured operational data – for fear of falling behind.
Either way, the CFO is now plagued with requests for investment in big data solutions that, quite frankly, cannot be justified on the basis of business benefits.
By persisting with the current investment in outdated, inefficient and overpriced big data solutions, CFOs are throwing good money after bad.
It is time to accept that attempting to manage fast-escalating data volumes with a traditional RDBMS does not work. But is the CFO brave enough to demand that IT takes a step back, explores the latest innovative tools and techniques for managing data, and makes a small but justifiable investment that has a real chance of delivering a return?