The story of data and analytics is one that keeps evolving. From appointing chief data officers to procuring the latest analytics software, business leaders are desperately trying to extract value from their data, but it's not easy.
“The size, complexity, distributed nature of data, speed of action and the continuous intelligence required by digital business means that rigid and centralised architectures and tools break down,” says Donald Feinberg, vice president and distinguished research analyst at Gartner. “The continued survival of any business will depend upon an agile, data-centric architecture that responds to the constant rate of change.”
But while business leaders have to tackle digital disruption by looking for the right services and technology to help streamline their data processes, unprecedented opportunities have also arisen. The sheer amount of data, combined with the increase in processing power enabled by cloud technologies, means it's now possible to train and execute algorithms at the large scale necessary to finally realise the full potential of AI.
According to Gartner, it’s critical to gain a deeper understanding of the following top 10 technology trends fuelling that evolving story and prioritise them based on business value to stay ahead.
Trend #1: Augmented analytics
Gartner says by 2020, augmented analytics will be the main selling point for analytics and BI solutions.
Using machine learning and AI, augmented analytics is considered by Gartner to be a disrupter in the data and analytics market, because it will transform how analytics content is developed, consumed and shared.
Trend #2: Augmented data management
Augmented data management uses machine learning capabilities and AI technology to make data management categories, including data quality, master data management, metadata management, data integration and database management systems (DBMSs), self-configuring and self-tuning.
According to Gartner, this is a big deal because it automates many of these manual tasks, opening up opportunities for less technically skilled users to work with data. It also frees highly skilled technical staff to focus on higher value-adding tasks.
Gartner predicts that, through to the end of 2022, manual tasks in data management will be cut by 45% thanks to machine learning and automated service-level management.
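To make the idea concrete, here is a minimal sketch of the kind of data-quality check such tooling automates: profiling a dataset and flagging issues a person would otherwise hunt for by hand. The `profile` function and the sample records are hypothetical, purely for illustration.

```python
def profile(records):
    """Toy 'augmented data management' pass: automatically profile a
    dataset and flag quality issues (missing values, type drift)."""
    issues = []
    for col in records[0].keys():
        values = [r.get(col) for r in records]
        missing = sum(v in (None, "") for v in values)
        # Collect the Python types present among the non-missing values.
        types = {type(v).__name__ for v in values if v not in (None, "")}
        if missing:
            issues.append(f"{col}: {missing} missing value(s)")
        if len(types) > 1:
            issues.append(f"{col}: mixed types {sorted(types)}")
    return issues

rows = [
    {"id": 1, "age": 34},
    {"id": 2, "age": "34"},   # type drift: string instead of int
    {"id": 3, "age": None},   # missing value
]
print(profile(rows))
# → ['age: 1 missing value(s)', "age: mixed types ['int', 'str']"]
```

Real augmented data management products go much further, using ML to learn what "normal" looks like for each dataset, but the pattern of machine-generated findings replacing manual inspection is the same.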
Trend #3: Continuous intelligence
Continuous intelligence is more than a new way of saying real-time analytics. It describes a design pattern in which real-time analytics are integrated with business operations, processing both current and historical data to prescribe actions in response to events.
“Continuous intelligence represents a significant change in the job of the data and analytics team,” says Rita Sallam, research vice president at Gartner. “It’s a grand challenge — and a grand opportunity — for analytics and BI (business intelligence) teams to help businesses make smarter real-time decisions in 2019. It could be seen as the ultimate in operational BI.”
By 2022, more than half of significant new business systems will incorporate continuous intelligence that uses real-time context data to improve decisions.
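The design pattern above can be sketched in a few lines: each incoming event is compared against a rolling window of recent history, and an action is prescribed in response. The function name, window size and threshold below are illustrative assumptions, not any vendor's API.

```python
from collections import deque

def continuous_intelligence(events, window=5, threshold=1.5):
    """Toy continuous-intelligence loop: judge each incoming event
    against a rolling window of historical context and prescribe an
    action (here, 'alert' on a spike) in response."""
    history = deque(maxlen=window)  # rolling historical context
    actions = []
    for value in events:
        if history and value > threshold * (sum(history) / len(history)):
            actions.append(("alert", value))  # prescribed response
        else:
            actions.append(("ok", value))
        history.append(value)  # fold the current event into history
    return actions

# A spike in a stream of metric readings triggers an alert.
readings = [10, 11, 10, 12, 30, 11]
print(continuous_intelligence(readings))
# → [('ok', 10), ('ok', 11), ('ok', 10), ('ok', 12), ('alert', 30), ('ok', 11)]
```

Production systems would do this over streaming platforms with far richer models, but the principle of decisions made inside the event loop, rather than in a separate report, is what distinguishes continuous intelligence from batch BI.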
Trend #4: Explainable AI
AI is already being increasingly used in data management, but how do AI solutions explain why they came to certain conclusions? This is where explainable AI comes in.
Explainable AI in data science and ML platforms is about generating an explanation of data models in terms of accuracy, attributes, model statistics and features in natural language.
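One common model-agnostic explanation technique is permutation importance: scramble one input feature at a time and measure how much the model's error grows, revealing which attributes its conclusions actually depend on. The sketch below is a deliberately simplified, self-contained version with a toy model; the names and data are hypothetical.

```python
def permutation_importance(predict, X, y):
    """Toy explainability pass: permute each feature column and report
    how much the model's mean absolute error increases. A large
    increase means the model relies heavily on that feature."""
    def error(rows):
        return sum(abs(predict(r) - t) for r, t in zip(rows, y)) / len(y)

    baseline = error(X)
    importances = []
    for j in range(len(X[0])):
        column = [row[j] for row in X]
        rotated = column[1:] + column[:1]  # deterministic permutation
        permuted = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, rotated)]
        importances.append(error(permuted) - baseline)
    return importances

# A toy "model" that only ever uses feature 0; the explanation
# should make that reliance visible.
model = lambda row: 2 * row[0]
X = [[1, 5], [2, 3], [3, 8], [4, 1]]
y = [2, 4, 6, 8]
print(permutation_importance(model, X, y))  # → [3.0, 0.0]
```

The output shows feature 0 carries all the importance while feature 1 contributes nothing, which is exactly the kind of attribute-level explanation, rendered in natural language by commercial platforms, that explainable AI refers to.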
Trend #5: Graph
According to Gartner, graph analytics is a set of analytic techniques that help enterprises explore the relationships between entities of interest such as transactions, processes and staff.
The application of graph processing and graph database management systems will grow at 100% annually through 2022.
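At its simplest, graph analytics means traversing relationships rather than joining tables. The following sketch, with made-up entity names, shows a breadth-first walk over a tiny graph of transactions, processes and staff to find everything within two relationship hops of a transaction; a graph DBMS does the same job declaratively and at scale.

```python
from collections import deque

# A tiny graph of entities of interest (transactions, processes,
# staff), stored as an adjacency map: node -> connected nodes.
graph = {
    "txn:1001": ["staff:alice", "process:approval"],
    "staff:alice": ["txn:1001", "process:approval", "staff:bob"],
    "process:approval": ["txn:1001", "staff:alice"],
    "staff:bob": ["staff:alice", "txn:2002"],
    "txn:2002": ["staff:bob"],
}

def related_entities(start, max_hops=2):
    """Breadth-first traversal: which entities sit within `max_hops`
    relationships of the starting node?"""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # don't expand beyond the hop limit
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append((neighbour, depth + 1))
    return sorted(seen - {start})

print(related_entities("txn:1001"))
# → ['process:approval', 'staff:alice', 'staff:bob']
```

Note that `staff:bob` surfaces even though he never touched the transaction directly; multi-hop relationship queries like this are precisely where graph techniques outperform relational joins.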
Trend #6: Data fabric
Data fabric is all about a single and consistent data management framework. It looks at enabling frictionless access and sharing of data in a distributed data environment as opposed to siloed storage.
Through 2022, bespoke data fabric designs will be deployed primarily as a static infrastructure, forcing organisations into a new wave of cost when they later have to redesign completely for more dynamic data mesh approaches.
Trend #7: NLP/conversational analytics
By 2020, 50% of analytical queries will be generated via search, natural language processing (NLP) or voice, or will be automatically generated. The need to analyse complex combinations of data and to make analytics accessible to everyone in the organisation will drive broader adoption, allowing analytics tools to be as easy as a search interface or a conversation with a virtual assistant.
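A very rough sketch of the idea: map keywords found in a natural-language question onto a filter-and-aggregate over a dataset. The `answer` function, the dataset and the vocabulary below are invented for illustration; real conversational analytics products use full NLP pipelines rather than keyword matching.

```python
import re

# Hypothetical sales data for the sketch.
sales = [
    {"region": "emea", "product": "widgets", "revenue": 120},
    {"region": "apac", "product": "widgets", "revenue": 90},
    {"region": "emea", "product": "gadgets", "revenue": 200},
]

def answer(question):
    """Toy NL-to-query step: a filter dimension applies only when the
    question actually mentions one of its known values."""
    q = question.lower()
    rows = [
        r for r in sales
        if (r["region"] in q or not re.search(r"\b(emea|apac)\b", q))
        and (r["product"] in q or not re.search(r"\b(widgets|gadgets)\b", q))
    ]
    return sum(r["revenue"] for r in rows)

print(answer("What was total revenue for widgets in EMEA?"))  # → 120
print(answer("What was total revenue for widgets?"))          # → 210
```

Even this crude version shows the appeal: the user expresses intent in plain language and never sees the underlying query.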
The use cases are so vast that the NLP market is anticipated to be worth $13.4 billion by 2020, according to a separate study.
Trend #8: Commercial AI and ML
Gartner says by 2022, 75% of new end-user solutions leveraging ML and AI techniques will be built with commercial solutions rather than open source platforms.
Commercial vendors have built connectors into the open-source ecosystem, and they provide organisations with features that open-source technologies lack but which are necessary to scale and democratise AI and ML, such as project and model management, transparency, reuse, data lineage, and platform cohesiveness and integration.
Trend #9: Blockchain
Distributed ledger technologies such as blockchain look promising for data analytics because they can provide decentralised trust across a network of untrusted participants.
The ramifications for analytics use cases are major, especially those leveraging participant relationships and interactions.
But, according to Gartner, it will take several years for blockchain to fully take off in this area. In the meantime, enterprises will partly integrate with blockchain technologies and standards, which will likely be dictated by their dominant customers or networks. This includes integration with their existing data and analytics infrastructure.
Trend #10: Persistent memory servers
Persistent-memory technologies aim to reduce costs and complexity of adopting in-memory computing (IMC)-enabled architectures. Persistent memory represents a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads.
According to Gartner, it has the potential to improve application performance, availability, boot times, clustering methods and security practices. It will also help organisations reduce the complexity of their application and data architectures by decreasing the need for data duplication.
“The amount of data is proliferating and the urgency of transforming data into value in real-time is growing at an equally rapid pace,” says Feinberg. “New server workloads are demanding not just faster CPU performance, but massive memory and faster storage.”