Over the past year, I have attended a number of events in the US and the UK sponsored by their respective open data communities.
Clearly, the UK is well ahead of the US – as well as many other countries across the globe – in seeing its vision of empowering citizens and organisations via the wide availability of government-created data come to fruition.
I recently attended the Open Data Institute’s annual summit in London, where all of the major participants in this community – including government, academia and industry – gathered to learn and celebrate their accomplishments to date.
The event included a set of awards given to those individuals and start-ups that have taken the most innovative approaches to exploiting open data for commercial applications, as well as a pre-day of training for all levels of open data users.
In my opinion, this concept has legs – and as long as the funding, citizen advocacy and corporate support remain strong, there will be a bright future for all involved.
However, in spite of the UK's success to date, everything is far from perfect in the world of open data across the rest of the globe. Numerous challenges and impediments continue to stand in the way of any vision of open data being realised. Many are technical in nature, but a number are cultural.
First is the extremely poor quality of the data products provided by government entities, including the lack of appropriate metadata to supply the necessary historical and use-case context. Then there is the limited range of rich data products available from government, in spite of mandates to provide them, copyright waivers and public interest. And finally, there is the timeliness and latency of data products with respect to the currency of events.
These impediments to success are in most cases products of a culture of fear and retribution found in most bureaucracies.
Most government agencies still struggle to make their internal systems fit for purpose with respect to the fundamental services they deliver, and they require an inordinate level of analysis, remediation and reconciliation to meet the service delivery levels associated with their individual missions.
Exposing this data to others with little control over its use is a frightening scenario for far too many of them.
This has been the biggest obstacle to overcome in the US so far, in spite of hundreds of billions of dollars spent on IT architecture and applications over the past decade. That should come as no surprise to any enterprise architect, regardless of the sector they work in.
Much work is being done to align legislated mandates with specific behaviours and deliverables using internal task forces and direct intervention by senior civil servants.
I believe that in spite of strong resistance there is no going back in terms of becoming closed once again.
Strong commercial applications are being proffered, and the industry sectors representing these organisations have strong lobbyists working on their behalf to foster these interests with appropriate funding mechanisms.
One dirty little secret in most US agencies is that they are mandated to provide open data by a certain date but have not been given any additional headcount or funding to do so.
Open data is a growing force in the world of big data and analytics. It brings new assets into the mix for developers and service providers to use in providing feature-rich apps and services for their customers.
Unfortunately, like all other data sources, it suffers from major issues that good governance and provenance practices could readily resolve.
We must all remember that data is an asset and must be treated accordingly.