It is quite common these days for emerging technologies to be accompanied by intense hype, and artificial intelligence (AI) is a prime example. The debate over its uses and potential impact on society has expanded beyond the tech industry and into the public arena.
Indeed, of the 30 emerging technologies featured in Gartner’s latest Hype Cycle, nine relate to AI. That makes sense, given the breadth of applications AI lends itself to. Both AI and machine learning (ML) are set to have an enormous impact on the enterprise over the next few years.
However, the fact is that only a small number of products are bringing real AI or ML into the enterprise space today – though many claim to do so. Like the ‘cloud-washing’ that occurred with the arrival of cloud computing, a growing percentage of software today purports to be ‘AI-enabled’. But AI isn’t magic pixie dust that you sprinkle on your existing applications to make them more intelligent!
We therefore need to move away from this mindset and focus on actual outcomes. That also opens up a broader discussion about what enterprises are doing with AI versus what software vendors are doing with it, because those use cases can be completely different.
Most importantly, we must separate the hype from reality. Building on the advances we have already seen in automation, AI and ML could deliver real progress for our industry, creating self-healing, self-regulating management systems at scale. For that to happen, though, we need a greater degree of pragmatism, more honest conversations and, in the case of machine learning especially, more enterprise use cases.
We also need to embrace openness and become better at collaborating. There is a wealth of data being created and streamed in the average enterprise environment, and more than 80% of IT leaders say data sprawl is one of the most critical problems their organisations must address. Yet IT teams still struggle to capitalise on the value of that data, which can have a dramatic impact on the organisation’s bottom line.
If we can figure out a way to collate all those disparate sources of data, we can answer far more difficult – and far more interesting – questions than we ever could by focusing solely on our own use cases.
One way to do that is by making ourselves open: enabling easy integration and collaboration with other systems and exposing an open API. An API-first architecture means customers can integrate solutions with their existing tools for monitoring, provisioning, configuration management and more.
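As a rough sketch of what that kind of API-first integration can look like, the snippet below pulls open alerts from one tool’s REST API and re-posts them as events in another system. The endpoints, token and field names are hypothetical placeholders rather than any particular vendor’s API; a real integration would use whatever authentication and payload schemas your own tools expose.

```python
import requests

# Hypothetical endpoints and token – substitute the real URLs, credentials
# and payload fields of the monitoring and management tools you actually run.
MONITORING_API = "https://monitoring.example.com/api/v1/alerts"
MANAGEMENT_API = "https://management.example.com/api/v1/events"
API_TOKEN = "replace-with-your-token"


def forward_open_alerts() -> None:
    """Pull open alerts from one tool's open API and push them into another."""
    headers = {"Authorization": f"Bearer {API_TOKEN}"}

    # Fetch currently open alerts from the monitoring tool.
    response = requests.get(
        MONITORING_API, params={"status": "open"}, headers=headers, timeout=10
    )
    response.raise_for_status()

    # Re-post each alert as an event in the management system,
    # so both tools work from the same picture.
    for alert in response.json():
        requests.post(
            MANAGEMENT_API,
            json={
                "source": "monitoring",
                "severity": alert.get("severity", "unknown"),
                "message": alert.get("message", ""),
            },
            headers=headers,
            timeout=10,
        ).raise_for_status()


if __name__ == "__main__":
    forward_open_alerts()
```

A job like this could run on a schedule or be triggered by a webhook; the point is simply that an open, documented API makes the wiring straightforward.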
You may have heard the expression ‘two sets of eyes are better than one’; the same applies here. If I can leverage the intelligence you are detecting and combine it with what I am seeing, then we can both reach better outcomes. And that absolutely applies to ML and AI.
The growth of IoT, industrial IoT and edge devices will add to the onslaught of data, and it will probably be the biggest driver of ML in the enterprise. Enterprises will be asking: what can we learn from all the data coming in, and can we do something interesting with it? Imagine the possibilities of interpreting, visualising and producing insights from all that device data. That is going to be the challenge.
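As a small illustration of the kind of insight that device data lends itself to, here is a minimal sketch of a rolling anomaly check over a single streamed sensor metric. The window size and threshold are arbitrary assumptions chosen for the example; a production pipeline would sit on a proper streaming platform rather than an in-memory buffer.

```python
from collections import deque
from statistics import mean, stdev

WINDOW_SIZE = 60   # keep the last 60 readings (assumed, for illustration)
THRESHOLD = 3.0    # flag readings more than 3 standard deviations from the mean

window: deque = deque(maxlen=WINDOW_SIZE)


def ingest(reading: float) -> bool:
    """Return True if the new reading looks anomalous against recent history."""
    anomalous = False
    if len(window) >= 10:  # wait for a little history before judging
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(reading - mu) > THRESHOLD * sigma:
            anomalous = True
    window.append(reading)
    return anomalous
```

Feed the same check with readings from several devices, or from a partner’s feed, and the case for openness made above becomes very tangible.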
These models are already becoming more and more sophisticated. Take a personal scenario: your bank could potentially recognise and anticipate your behaviour, so that it knows when you are travelling, for example. And if your bank could access additional sources of data, such as information from your travel agent or airline, it could avoid issuing a security alert or blocking your credit card because it has detected ‘irregular’ behaviour.
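To make that concrete, here is a purely illustrative sketch of the cross-source check: a foreign card transaction only raises an alert when no shared travel data corroborates it. The customer IDs, country codes and the KNOWN_ITINERARIES lookup are invented for the example; in practice that information would come from another organisation’s API, with the customer’s consent.

```python
from dataclasses import dataclass


@dataclass
class Transaction:
    customer_id: str
    country: str   # country where the card was used
    amount: float


# Invented stand-in for itinerary data shared by an airline or travel agent –
# in reality this would be a lookup against another organisation's API.
KNOWN_ITINERARIES = {
    "cust-123": {"FR", "IT"},  # flights booked to France and Italy
}


def should_raise_alert(txn: Transaction, home_country: str = "GB") -> bool:
    """Flag a foreign transaction only when no travel data corroborates it."""
    if txn.country == home_country:
        return False
    travelling_to = KNOWN_ITINERARIES.get(txn.customer_id, set())
    # If the extra data source confirms the trip, suppress the 'irregular
    # behaviour' alert instead of blocking the card.
    return txn.country not in travelling_to
```

Here, should_raise_alert(Transaction("cust-123", "FR", 420.0)) returns False because the shared itinerary confirms the trip, while a purchase in a country with no matching itinerary would still be flagged.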
The same applies in the enterprise space: that openness is going to drive much smarter decisions and significantly better outcomes.
Only by prioritising a move towards open, integrated systems will we realise the potential of machine learning and begin to see self-driving data centres that continuously work towards a pre-set ideal operating state.