You need to start at the beginning. And Greg Hanson starts with a finding from Gartner, “Increasingly by 2020, customers will make decisions based on experience rather than traditional buying habits around price and delivery or availability,” he says. That is where data enters the tale.
“I get letters through the post addressing me as Greg, Gregg, Gregory, Gregory James (where they’ve got my passport information for some reason). I even get letters from the same parent company that are addressed to me in different ways. That suggests that they have no single view of me and my engagement with their company.”
The CTO role: ‘It’s about planning and business opportunities’
Greg Hanson (with two gs, one at the beginning, one at the end) is the CTO of Informatica across EMEA (that’s Europe, the Middle East and Africa) and Latin America. “Informatica,” claims Gregg (whoops, Greg), “is the world’s leading enterprise cloud data management company.”
“And there is nothing worse than experiencing bad data or inappropriate data in an app,” he says.
So, how do organisations get it right? Gregory James takes us through three stages.
The importance of data mining
The first aspect of data considerations: the standardisation of data
It seems that digital transformation journeys have hit a block. Greg explained: “Often there’s been too big a focus on the technology side.
“Yes, there’s a technology element, but what you really want as an organisation in the digital world is agility, the ability to do things quickly, to respond in real time.”
In short, it’s not about technology, it’s about agility; technology is just a means to an end.
That’s why it is important to standardise data. “Fragmentation of technologies,” suggests Greg, is the enemy of agility.
“Standardising the technology stack wherever possible is critical. If you don’t do that then you get a fragmentation of skill sets which hampers organisations in their ability to drive innovation, change and agility.”
Put your data to work
The second aspect of data considerations: data strategy
If you go and speak to the chief data officer or chief risk officer at many B2C-style organisations, they won’t know where all their customer data resides.
“So, how can we possibly start to build an accurate picture of how our customers have dealt with us historically, what the patterns and trends are, and how we can use that information to intelligently guide what we do moving forward?”
Dealing with this takes a multi-pronged approach:
- Discovery of data: The starting point is a discovery exercise. “This involves understanding where data resides.”
- Classification of data: Building upon data discovery is classification. This often entails applying artificial intelligence to rapidly build up the catalogue based on that discovery exercise, helping organisations develop standards around what they understand by “customer”. Greg then gives a proviso: “It might seem to be a simple thing, but it’s actually not for many organisations who have got multiple CRM systems, have gone through acquisitions and are quite complex entities.” He summarised it thus: “We can leverage artificial intelligence to help us in the discovery, and then, using artificial intelligence, we can also classify, type and index data and suggest to an organisation what their customer data looks like.”
- Catalogue of data: This drives informed decisions: once you’ve got a centralised catalogue of information, you can start to do things like bring customer data together.
- Single view of data: This is where the single view of data enters the story: “When you bring customer data together, you’ve got a series of duplicates of information which you need to standardise. This will enable you to build what’s called a single view of data, or a master record, or a golden record.” This is where the names Greg, Gregg, Gregory and Gregory James enter the story: a single view of data, or master copy of data, could, for example, help eradicate such issues.
- Baselining of data: After the discovery and classification of data, we then start to build measures, a baseline of what good data looks like, so that we can measure the existing data sets against it and then start to build remediation to help fix data in an automated way. In short, applying business rules.
- Fix-first-time principle of data: “Once we’ve built those rules, we can then deploy them, introducing what we call a fix-first-time principle, which means building it into the area or point where data actually comes into the organisation. So that could be through an application if you’re registering a new user, it could be through a call centre with one of our agents, or it could be in store across the omnichannel experience.”
- Leverage the data: Once we’ve got an approach to data quality, we can leverage it.
- Data scientists: “This catalogue of data information means that organisations have an identification of where their high quality data assets reside. That can drive a lot of value for an organisation, but only if it’s made available to the people who can translate it into value. So, that’s data scientists, for example, and line of business managers, who can really leverage that data to offer those innovative services in real time — product, price, promotions, place, driven by high quality, real-time data.”
- The data interface: “The way to make that available is to have a business-friendly, Google-like interface that can help search those data assets, and that’s what the cataloguing of data and information can provide. Users can come in, query and find accurate, complete and consistent data, and then make high-quality decisions which will ultimately drive more revenue, more cross-sell, more up-sell and lower customer churn, for example, in typical B2C industries.”
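The path from duplicate records to a single golden record can be sketched in miniature. This is a toy illustration only: the record fields, the email match key and the “longest name wins” survivorship rule are assumptions made for the example, not Informatica’s actual matching logic; real master data management platforms use fuzzy matching and far richer survivorship rules.

```python
from dataclasses import dataclass

# Hypothetical customer records pulled from several CRM systems.
@dataclass
class CustomerRecord:
    source: str
    name: str
    email: str
    postcode: str

def normalise(record: CustomerRecord) -> CustomerRecord:
    """Standardise the fields we match on (a simple 'business rule')."""
    return CustomerRecord(
        source=record.source,
        name=" ".join(record.name.split()).title(),
        email=record.email.strip().lower(),
        postcode=record.postcode.replace(" ", "").upper(),
    )

def build_golden_records(records):
    """Cluster duplicates on a match key, then keep one master per cluster."""
    clusters = {}
    for rec in map(normalise, records):
        # Match key: standardised email. Real systems use fuzzy matching
        # across many fields, not a single exact key.
        clusters.setdefault(rec.email, []).append(rec)
    golden = {}
    for key, dupes in clusters.items():
        # Survivorship rule (assumed): prefer the most complete name.
        golden[key] = max(dupes, key=lambda r: len(r.name))
    return golden

records = [
    CustomerRecord("crm_a", "greg hanson", "GREG@example.com ", "sw1a 1aa"),
    CustomerRecord("crm_b", "Gregory James Hanson", "greg@example.com", "SW1A1AA"),
    CustomerRecord("crm_b", "Gregg Hanson", "greg@example.com", "SW1A 1AA"),
]

golden = build_golden_records(records)
# All three name variants collapse to a single master record.
print(golden["greg@example.com"].name)  # → Gregory James Hanson
```

The point of the sketch is that deduplication only works after standardisation: without lower-casing the emails first, the three records would never land in the same cluster.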
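The fix-first-time principle, standardising and validating data at the point where it enters the organisation, can likewise be sketched. The rules below (a minimal email pattern and a UK-style postcode check) are invented for illustration and are not a real Informatica rule set.

```python
import re

# Illustrative data-quality rules applied at the point of entry
# (a registration form, a call-centre screen, an in-store system).
# Each rule pairs a standardiser with a validation pattern.
RULES = {
    "email": (re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"), str.lower),
    "postcode": (re.compile(r"^[A-Z0-9]{5,8}$"),
                 lambda s: s.replace(" ", "").upper()),
}

def fix_first_time(field: str, value: str) -> str:
    """Standardise first, then validate: bad data is rejected before storage."""
    pattern, fixer = RULES[field]
    cleaned = fixer(value.strip())
    if not pattern.match(cleaned):
        raise ValueError(f"{field} failed data-quality rule: {value!r}")
    return cleaned

print(fix_first_time("email", " Greg@Example.COM "))  # → greg@example.com
print(fix_first_time("postcode", "sw1a 1aa"))         # → SW1A1AA
```

Running the same rules at every entry point (app, call centre, in store) is what keeps the omnichannel picture consistent: data is fixed once, on the way in, rather than remediated later.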
The third aspect of data considerations: data culture
So far so good, but there has to be a corporate culture conducive to the optimal use of data, too.
For a cultural strategy, the best organisations build this into the DNA of the company. GDPR and other regulatory compliance considerations have helped force the issue. “The best companies, in my opinion, make the governance and quality and security of data part of the DNA of an organisation, so tasks such as extracting data onto USB memory sticks start to become something that is treated as culturally unacceptable, for example; or taking data out and sharing it on insecure file servers becomes culturally unacceptable. And the way to build a culture like that is right from the top.”
Demographics are an issue too. “Generation X and Generation Y are increasingly security aware.” These generations typically see organisations as the custodians of their data, and absolutely not the owners. This emphasis on acting as a custodian of data should be built into an organisation’s DNA. “If you’re not acting as a good custodian of an individual’s data, they won’t want to deal with you any more. They will want to deal with companies who not only have a good environmental policy but have a good security and governance policy about custody of ‘my data’ while it’s in their hands.”
Building a data culture to accelerate digital transformation
“We’ve already seen examples of that, where poor cultural data governance has been a key factor in data breaches. We’ve seen it have an impact on the brand recognition, sales and share prices of the companies concerned.”
There is just one more thought for consideration. Greg put it this way: “There are an increasing number of organisations that, as part of that cultural deployment around good governance of data, have begun to remunerate people based upon their demonstrating good culture and good activity around managing data.”
But building data-related KPIs into an organisation is a topic for another day.