There’s no doubt about it: banks have been having a rough time with their IT systems recently. In the last year alone, nearly all of the major lenders have suffered infrastructure downtime or IT glitches, leaving their customers feeling stranded, disappointed and considering alternative services.
For many of the IT issues the banking behemoths grapple with, constraints imposed by the enormous amounts of data they hold are likely to be at the root of the problem.
But the banks aren’t alone in suffering from downtime and glitches. As Big Data, mobile and cloud services grow in popularity, we’re seeing organisations in all industries collect increasingly large and complex data sets across more and more applications. The IT systems those organisations have in place are now struggling to cope with the volume and complexity of that data at the rate the business needs.
> See also: Natwest, RBS banking system down for hours
Typically there are two big pain points for data-intensive organisations:
Legacy infrastructure
As technology evolves, many of the systems and servers that organisations use to store and manage their data are unable to cope with the sheer volume and complexity of the data sets, making them slower and less responsive.
But trying to modernise these systems isn’t always easy. One financial institution we work with calculated that modernising its infrastructure using traditional means would take the equivalent of 625 man-years of work, at a cost of over $1 billion.
Application development
The ability for teams to smoothly roll out application refreshes or develop brand new applications is vital for businesses to stay competitive. However, all too often the development process is constrained by access to fresh data. That means either working with old data or waiting days, weeks or even months for DBAs and storage admins to provision new data sets.
This leaves teams with two ugly choices: develop and test against stale data, inevitably producing bugs, or wait for fresh data and stretch the development cycle far beyond what it should take. Either way, DevOps teams come under pressure to deliver more applications in a shorter time frame, and end up rolling out applications that haven’t been sufficiently tested.
> See also: IT glitch downs NASDAQ stock exchange
We’re now seeing organisations adopt new technology such as data virtualisation, which can fix these problems and make data more agile. As a new architectural layer, data virtualisation can help businesses in some important ways:
Virtual data environments can be delivered on demand in minutes, with developers able to self-service data refreshes, resets, bookmarking and branching, removing the data bottleneck.
By centralising data provisioning into a separate platform, companies can secure and control data, track versions and changes, and preserve a full data history.
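To make the self-service idea concrete, the sketch below shows roughly what that workflow could look like from a developer’s point of view. It is a minimal illustration only, not any vendor’s actual API: the VirtualDataService class and every method name in it are hypothetical stand-ins for whatever control plane a data virtualisation platform exposes.

```python
# Hypothetical sketch of a self-service virtual data workflow.
# Nothing here is a real vendor API; class and method names are invented
# purely to illustrate the provision / refresh / bookmark / branch pattern.

from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List


@dataclass
class VirtualCopy:
    """A lightweight, writable view of a source dataset."""
    name: str
    source: str
    created: datetime = field(default_factory=datetime.utcnow)
    bookmarks: List[str] = field(default_factory=list)


class VirtualDataService:
    """Toy control plane: developers provision copies without raising a ticket."""

    def __init__(self) -> None:
        self._copies: Dict[str, VirtualCopy] = {}

    def provision(self, source: str, name: str) -> VirtualCopy:
        # Delivered in minutes because no physical data is copied up front.
        copy = VirtualCopy(name=name, source=source)
        self._copies[name] = copy
        return copy

    def refresh(self, name: str) -> None:
        # Re-point the copy at the latest state of its source.
        self._copies[name].created = datetime.utcnow()

    def bookmark(self, name: str, label: str) -> None:
        # Record a named point-in-time state to reset or branch from later.
        self._copies[name].bookmarks.append(label)

    def branch(self, name: str, new_name: str) -> VirtualCopy:
        # Fork an independent copy for a parallel dev or test stream.
        parent = self._copies[name]
        return self.provision(source=parent.source, name=new_name)


# Example: a developer grabs fresh test data on demand.
svc = VirtualDataService()
svc.provision(source="prod-orders-db", name="orders-dev")
svc.bookmark("orders-dev", "before-schema-change")
svc.branch("orders-dev", "orders-feature-x")
svc.refresh("orders-dev")
```

Because provisioning, refreshing and branching become API calls rather than tickets to a DBA, the wait for fresh data collapses from days or weeks to minutes.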
> See also: Active Directory glitch caused IT outage, says NHS Greater Glasgow and Clyde
This new virtualisation layer enables copies of data to take up a tenth of the space they did before, saving money and giving legacy storage infrastructure room to breathe.
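That saving comes from virtual copies sharing unchanged blocks with a single source image and storing only the blocks they modify, in the spirit of copy-on-write snapshots. The toy model below illustrates the idea only; real platforms work at the storage layer, and the class names, block sizes and in-memory ‘storage’ here are all invented for the example.

```python
# Illustrative copy-on-write model: a virtual copy stores only the blocks
# it has modified and reads everything else from the shared source image.

from typing import Dict, List


class SourceImage:
    def __init__(self, blocks: List[bytes]) -> None:
        self.blocks = blocks  # the single physical copy of the data


class VirtualCopy:
    def __init__(self, source: SourceImage) -> None:
        self.source = source
        self.delta: Dict[int, bytes] = {}  # only changed blocks live here

    def read(self, index: int) -> bytes:
        # Prefer the locally modified block, else fall back to the shared image.
        return self.delta.get(index, self.source.blocks[index])

    def write(self, index: int, data: bytes) -> None:
        self.delta[index] = data  # copy-on-write: divert the block locally

    def physical_footprint(self) -> int:
        return sum(len(b) for b in self.delta.values())


source = SourceImage(blocks=[b"x" * 4096 for _ in range(1000)])  # ~4 MB of data
copy = VirtualCopy(source)
copy.write(7, b"y" * 4096)  # the copy changes a single block

print(copy.physical_footprint())           # 4096 bytes, not ~4 MB
print(copy.read(7)[:1], copy.read(8)[:1])  # b'y' b'x'
```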
Creating agile data also means companies can double down on their application development, vastly improving the rate at which teams can turn out new applications that have been properly tested.
With the resources this frees up, organisations can spend more time innovating and less time ‘keeping the lights on’.
In our data age, organisations of any type can’t afford to suffer downtime and glitches; to keep up with the competition, they need to be able to change direction at 100mph. It’s time for them to take charge of their IT infrastructure and adopt plans for agile data. In doing so, they might finally be able to truly reap the benefits the data age promises.
Sourced from Iain Chidgey, VP and General Manager EMEA, Delphix