The mainframe has a bad reputation. But the technology is not the issue.
The challenge for the large enterprise is getting the most out of this often-labelled ‘outdated’ technology: getting to an environment that’s more cost-effective, more innovation-friendly and not dependent on a declining skill set, “which is to say baby boomers,” says Dale Vecchio, chief marketing officer at LzLabs.
This is not an impossible task, far from it, and organisations (with some help) can move mainframe data and applications without change. “You don’t have to recompile them, you don’t have to change the data,” continues Vecchio. This reduces the complexity and risk of moving applications to a “modern x86” or cloud environment.
But let’s take a step back: what’s the fundamental challenge of utilising the mainframe?
How can organisations reduce the risks (and overcome the challenges) of moving mainframe workloads and applications to an environment that is most conducive to innovation and agility?
Cost, skills and agility
For most organisations, the challenge of the mainframe comes down to cost, skills and agility. These three factors haven’t changed in the last 10 to 15 years, and they remain the issue.
No one ever says ‘this mainframe is bad, I need to get my business off it’.
It is, however, expensive to run. “If you’re a smaller mainframe shop and you get off, you’re going to save 70% or more of what you spend on the mainframe to run the exact same workload,” says Vecchio. “If you’re very, very large, it’s a different story.”
“Multinational established businesses and brands are finding it increasingly difficult to keep pace with more agile ‘born-on-the-web’ competitors who are able to leverage the rich data they hold to offer better and more innovative services. In fact, according to recent research from Microsoft and LzLabs, 71% of global IT leaders said that the inflexibility of their mainframe limits the ability of their IT department to innovate. Large organisations still dependent on legacy systems need to begin the process of freeing their data from these systems and opening it up to the data analytics revolution.”
The largest enterprises (think Amazon, Google or IBM) can rationalise the cost if the platform is effective, secure and reliable. But here is the second challenge: you’re still dependent on a workforce that is declining.
Mainframers, for the most part, are members of the baby boomer generation, and they are leaving the workforce. In spite of efforts by IBM and others to grow a younger generation of mainframe talent, “globally that’s not been successful,” argues Vecchio.
So, the mainframe can be very expensive, the people who really know it are retiring and it’s increasingly difficult to replace them.
Outsourcing could be a solution, but Vecchio believes this comes with its own problems: “Pretty soon that outsourcing company is going to know more about your applications than you do, and that’s never a good spot to be in.”
The final mainframe hurdle concerns agility and innovation. Access to data is part of where innovation comes from, and moving that data off the mainframe is a challenge. The way data is stored and represented on a mainframe is different “than any other platform in the galaxy,” says Vecchio.
“The IBM mainframe is the only one that represents data in what’s called EBCDIC. This particular format is unique. As long as that is encoded in that way, it’s difficult.
“Unless the data is in Db2 — which gives you at least half a chance because IBM does have analytics solutions for Db2 — and as long as you’re willing to spend the money and be dependent on a single vendor for that solution; that’s a partial solution. But, there’s a lot of data in the mainframe that is stored and accessed in different ways. And those different ways are extremely difficult to access and use for analytics in a timely fashion.”
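To make the encoding point concrete, here is a minimal Python sketch (the byte values are illustrative, and cp037 is assumed as a representative EBCDIC code page) showing how the same bytes read completely differently under EBCDIC and a modern character set:

```python
# The word "HELLO" as a mainframe would store it, in EBCDIC (code page 037).
ebcdic_bytes = b"\xC8\xC5\xD3\xD3\xD6"

# Decoded with the right codec, the data is perfectly readable.
print(ebcdic_bytes.decode("cp037"))    # HELLO

# Decoded as Latin-1/ASCII-family text, as most modern tools would assume,
# the same bytes come out as gibberish.
print(ebcdic_bytes.decode("latin-1"))  # ÈÅÓÓÖ
```

This is why simply copying files off the platform achieves little: every byte has to be translated, and character encoding is only the first layer of the problem.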
“Mainframes are good platforms, lots of big companies are still running them, but that’s fine as long as you can continue to minimise the risk of failure. If something fails now, who are you going to call?” — Vecchio
Bridging the gap between mainframe and modern analytics
What is the most effective way for companies to get at their most valuable data? According to PwC, it’s using what you already have.
If your organisation has a mainframe, what you already have is sitting in there, but you can’t get to it. It’s difficult to separate the data from the application, because the data was created by the applications. It’s not easy to modify applications on the mainframe, and this inflexibility is hindering innovation.
“The tricky part is not only how do we get the data out of there so you can use it, but how can we still preserve the original mainframe application’s ability to run?” asks Vecchio.
Well, you need a partner. Solutions such as those offered by LzLabs can move this mainframe data into Postgres, which makes it immediately accessible to any modern data analytics tool without compromising the legacy application. “This is what attracted me to LzLabs,” says Vecchio, after 18 years as a Gartner analyst covering the mainframe modernisation space.
Open source now takes centre stage. If organisations can get this data into the world of open source innovation, on any modern x86 or cloud environment, they can take full advantage of the workforce and the brains of the open source community. “Organisations can have access to any analytics tool, from any vendor or any open source solution,” explains Vecchio.
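As a hypothetical illustration of what that opens up (the connection string, table and column names below are invented for the example, not LzLabs specifics): once the records are ordinary rows in Postgres, any mainstream analytics stack can query them, for instance Python with pandas and SQLAlchemy:

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical Postgres database holding records migrated off the mainframe.
engine = create_engine("postgresql://analyst:secret@localhost/legacy_dw")

# Once the data is relational, a plain SQL query replaces any proprietary
# mainframe extract step.
query = text(
    "SELECT account_id, txn_date, amount "
    "FROM transactions WHERE txn_date >= :since"
)
df = pd.read_sql(query, engine, params={"since": "2019-01-01"})

# From here, the whole open source analytics ecosystem is available.
print(df.groupby("account_id")["amount"].sum().head())
```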
Extracting value from data: how to do it and the obstacles to overcome
“The first step to modernisation is getting the data off the mainframe and reducing costs, without going through the risk of rewriting, moving to a large-scale package or converting all of the data,” he says. “That’s not rocket science, but it’s not easy either, because a lot of the meaning of the data is in the application.”
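To see why the meaning lives in the application, consider that a mainframe file is often just fixed-width records whose field boundaries and types are defined only in the program’s record layout (a COBOL copybook, for example). Here is a small sketch with an invented layout; a real file would typically also be EBCDIC-encoded and might use packed decimal fields:

```python
# One raw 22-byte record from a hypothetical flat file. Without the
# application's record layout, it is just an undifferentiated run of bytes.
record = b"00101KETTLE    0002499"

# The slicing below encodes the "meaning" that normally lives in the program:
# bytes 0-4 are a customer id, 5-14 a product name, 15-21 an amount in pence.
customer_id = int(record[0:5])
product = record[5:15].decode("ascii").strip()
amount = int(record[15:22]) / 100

print(customer_id, product, amount)  # 101 KETTLE 24.99
```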
Leveraging innovation
A significant amount of business-critical data resides on the mainframe. According to a 2018 Data Center Knowledge article from Chris O’Malley, CEO of Compuware, ‘57% of mainframe users currently run more than half of their business-critical applications on the platform, with this number expected to increase to 64% by next year.’
The combination of different data sources is key to leveraging innovation.
Let’s use retail as an example.
How does Amazon know what to recommend to its customers? Retailers have been managing transactional data on the mainframe for decades. But organisations need to be able to blend that transactional data with demographic data, and demographic data is not on the mainframe; those applications were built before demographic data mattered.
Retailers now need to know what you’re buying, and where and when you’re buying it. They need to know who you are, how old you are and as many other details as possible.
Applying modern analytics tools to these different data sources — from the mainframe and IoT devices, for example — in an open source environment allows organisations to leverage innovation.
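A minimal sketch of that blending step, using pandas with invented sample values: mainframe-sourced transactions joined to a demographic feed that never lived on the mainframe:

```python
import pandas as pd

# Transactional history migrated off the mainframe (hypothetical columns).
transactions = pd.DataFrame({
    "customer_id": [101, 101, 202],
    "product":     ["kettle", "toaster", "blender"],
    "amount":      [24.99, 39.99, 54.50],
})

# Demographic data from a modern source, such as a CRM or web signup form.
demographics = pd.DataFrame({
    "customer_id": [101, 202],
    "age_band":    ["35-44", "18-24"],
    "region":      ["Manchester", "London"],
})

# Once both sources live off-platform, blending them is a single join.
blended = transactions.merge(demographics, on="customer_id")
print(blended.groupby(["region", "age_band"])["amount"].sum())
```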
The data on the mainframe is effectively locked away by its legacy implementation.
“It’s about bringing this transactional data into a world where you can predict things,” says Vecchio. “You need to put yourself in a spot where you can leverage innovation, where you can leverage geographic data, social media data, or something else along with historical transactional data.
“You want to do that in an efficient way. You don’t want to have to go through some kind of technological unnatural act in order to make that happen. People need to be quick and efficient and I think these are the kinds of things that are driving people to the world where that happens, and that’s increasingly x86 and open source.”
“You might say mainframe storage is the Latin of modern technology. It’s a dead language, but if you can understand it and know how to read it, you can leverage the data contained in it” — Vecchio