The late Douglas Adams, author of The Hitchhiker's Guide to the Galaxy, once observed, in words to this effect: ‘Everything that exists before you are born you simply take for granted; everything that comes along before you are about 35 is obviously an opportunity for career advancement; but everything that arrives after that time is clearly the work of the devil and must be resisted with every fibre in your body.’
A little harsh perhaps, but it neatly captures the challenge of building a global digital economy. There are still too many mature and successful organisations (and their leaders) that don't see the pressing need to embrace a digital future when the present appears to serve them very well. Yet the proof that digital technology changes lives, creates wealth and can transform society is all around them.
Like it or not, digital technology is the platform on which business and every industrialised economy runs today. That is not to say that technical skills are no longer important, but we are starting to see the benefits of a broader mindset encompassing the humanities, sociology, anthropology, the arts, psychology and other ‘softer’ disciplines. The technology is available, affordable and accessible, and it just works — now it is time to let our creativity and imagination find new possibilities.
Creativity and imagination are two of the attributes that set us, as humans, apart from the machines that we have created. Computers can process information faster and more reliably than humans can ever hope to.
A new generation of ‘thinking machines’, such as IBM's Watson, can discover answers to questions that they have not been specifically programmed for, providing decision support to busy professionals.
Computers can drive cars (better than humans, although most people remain sceptical), fly aeroplanes and perform a whole range of tasks that we previously thought only humans could do. But can they be truly creative? Can they imagine? Now there is a question to ponder.
More pressing, however, is the question of responsibility. In our increasingly litigious world, someone must shoulder the blame when a machine, however smart, appears to make a mistake. And, as the scope of smart machines extends to ever more significant decisions with potentially serious implications for humans, this is an issue that cannot be ignored.
Since the days of the industrial revolution, automation and technology have displaced humans from manual work. Now they are on the brink of doing the same to many administrative, clerical and knowledge-based workers.
This is a significant step and one with profound implications for our society. If we are to survive and prosper in a future where many of the more highly paid jobs are replaced by machines, then we had better start thinking very carefully about what we are really good at (and what is better left to the machines) — so that we can live in productive harmony with the technology we have unleashed.
Digital brings profound changes in so many ways, and some of them are very challenging. Digital undermines the old established idea of privacy and makes transparency the default. Everything is available and accessible 24/7 — with sometimes unsettling consequences.
Digital supports extreme granularity, as we capture billions of discrete data points and tweet our random thoughts. Digital supports billions of devices, from the traditional PC to the smartphone and wearable devices and, in the coming years, tens of billions of intelligent connected devices that will bridge the divide between the physical world and the digital world.
Sensor networks and the Internet of Things will blur the divide between the real and the virtual, giving rise to new business opportunities but further eroding the concept of privacy and amplifying the risks of cyber attacks.
Over the next few years, healthcare is likely to be the biggest opportunity for digital transformation, as rising costs and rapidly increasing life expectancy in industrialised economies force a move towards wellbeing and fitness and away from simply healing the sick.
Data will also be a challenge. A unified database of health records could be a goldmine for analysis, helping to deliver new treatments, spot unrecognised issues and offer a wealth of other benefits. But without adequate reassurance over privacy and misuse, most citizens will see only the loss to themselves and never the benefit to society as a whole. And as DNA data becomes more widespread, the stakes will get a lot higher.
Data is at the core of the emergent digital industrial economy as billions of devices observe and control every action and every object in our lives. Individuals, businesses and governments must learn to live in a world where transparency has replaced privacy and secrets are few and far between.
There is a growing tension, which can only get more intense as time passes, between the desire to capture and analyse the avalanche of data to deliver competitive advantage, and the risk of reputational damage when that same data is lost, leaked or used inappropriately. In today's socially connected online world, that risk is very real.
Moreover, it is all happening so fast. The pace of change is a real issue for big businesses where the culture, organisational structure and business processes have all evolved to support stability and manageable change.
Product and process cycles that were measured in years must now adapt in months, if not weeks or even hours. Regulatory and legal frameworks are slow to develop and slow to change, getting left behind in the relentless charge towards the digital future.
Many established leaders believe that the old rules will prevail and that the barriers to entry and success will continue to hold. They would do well to remember the fable of the vizier who asked his king to be paid in grains of wheat on a chessboard: one grain on the first square, two on the second, four on the third, and so on. Digital is like that; by the time it looks big enough to be a threat, it is already too late to catch up.
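To see why the fable bites, it helps to run the numbers. The short sketch below is a minimal illustration, assuming the classic 64-square board, of how the doubling stays unremarkable for a while and then becomes overwhelming.

```python
# Illustrative only: the chessboard doubling from the fable above.
# One grain on the first square, doubling on each of the 64 squares.

total = 0
for square in range(64):
    grains = 2 ** square              # 1, 2, 4, 8, ...
    total += grains
    if square in (7, 31, 63):         # end of row one, halfway, final square
        print(f"square {square + 1}: {grains:,} grains")

print(f"whole board: {total:,} grains")   # 2**64 - 1, roughly 1.8e19
```

The first row holds a few hundred grains in total; the final square alone holds more than nine quintillion. That is the shape of every exponential curve: negligible for a long time, then suddenly everywhere.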