One of the great things about working in the tech sector is that you regularly find yourself in thoughtful conversations. This piece is about one such conversation, which shines a light on a common fallacy of data privacy in the digital age: consent.
What is data privacy anyway?
The debate discussed here was predicated upon a particular notion of what it means to offer a customer a privacy-enhanced service. Data privacy, so the usual definition goes, is about user choice and transparency in data use. Give users a choice about sharing their personal data and you’ve ticked a key privacy box. Of course, the world and its machinations are rarely black and white. In fact, it was the less-than-granular nature of Google and Facebook’s consent processes that landed them with GDPR complaints from Max Schrems of Noyb. Noyb took Google and Facebook to task over ‘forced consent’ rather than freely given consent. In other words, if users decided they didn’t want to hand over their consent, belt and braces, they wouldn’t get the service, end of story. This ‘wild west’ approach to meeting the notion of privacy stretches the idea of consent to breaking point.
Data privacy must be more than an on/off consent switch. Because data privacy lies at the heart of how we transact via our digital personas, it involves considerations that go beyond the purely legal or the purely technical. Privacy needs a social prism.
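To make the on/off point concrete, here is a minimal sketch, in Python, contrasting the all-or-nothing gate Noyb objected to with purpose-scoped consent. The purpose names and the service logic are hypothetical, invented purely for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Consent captured per purpose, not as a single on/off switch."""
    # Hypothetical purposes, for illustration only.
    purposes: dict = field(default_factory=lambda: {
        "account_service": True,     # needed to deliver the service at all
        "personalised_ads": False,   # optional: the user may refuse
        "data_resale": False,        # optional: the user may refuse
    })

def forced_consent(record: ConsentRecord) -> bool:
    """The 'wild west' model: refuse anything and you get nothing."""
    return all(record.purposes.values())

def freely_given_consent(record: ConsentRecord, purpose: str) -> bool:
    """The granular model: each optional purpose stands alone, and
    refusing it must not block the underlying service."""
    return record.purposes.get(purpose, False)

user = ConsentRecord()
print(forced_consent(user))                            # False: no service at all
print(freely_given_consent(user, "account_service"))   # True: the service still works
print(freely_given_consent(user, "personalised_ads"))  # False: ads stay off, service stays on
```

The design point is the separation: optional purposes can be refused without switching off the service itself, which is what freely given consent requires.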
Here is how the debate panned out…
The big privacy debate — consent, implied consent and the GDPR
I had noticed in some Twitter and LinkedIn conversations that data privacy was being linked to the monetisation of data. I first spotted this shortly after some major data breaches, including the Facebook/Cambridge Analytica debacle. Industry folks were posting ideas about setting a payment level in exchange for varying levels of privacy, as a way to manage the privacy issues on free platforms such as social media.
Last week, I became involved in a Twitter debate about a related area. This time it developed into a debate about the direct monetisation of data. A tweet from Ann Cavoukian, privacy pioneer extraordinaire and inventor of Privacy by Design, said this:
“Legislation lags behind technology, every time. Digital property rights in the form of data ownership would restore personal control on the part of the data subject, and enable them to decide how their information would be used — it’s their decision to make!”
Someone then asked whether that meant people could monetise and trade their data.
Ann’s reply: “Surely that should be up to the data subject — it’s their information, to do with as they wish. If they wish to obtain remuneration in some form for the uses of their data, who are we to tell them no? Elitism has no place here. It’s all about personal control over one’s data.”
At this point, I had to wade in.
I pointed out that, whilst in a perfect world this would be fine, in a less-than-perfect one we would be creating a tiered privacy system: the wealthy would have the choice to retain their data privacy rights, whilst those in need would have far less choice.
Choice suddenly becomes less black and white and more 50 shades of grey, along the lines of: “I made the choice to sell my data because my baby needed food.”
50 shades of privacy
Before I go on, let me introduce Malcolm Crompton. Malcolm has the kind of résumé most of us dream of. To give you a few of his privacy credentials: he is a former Privacy Commissioner of Australia and was a director of the International Association of Privacy Professionals (IAPP) from 2007 to 2011. I asked Malcolm what he thought about the two sides of the privacy-by-consent debate.
Malcolm pointed to Joni Brennan, President of the Digital ID & Authentication Council of Canada (DIACC). During the debate, Joni stated: “Let’s have a positive-sum discussion. Subjects should have the right to self-monetise their data AND system designers (engineers, policy makers, etc) should be cognizant regarding the unintended effects this may have on populations at risk of exploitation.”
This nuanced and positive approach to the debate struck me. Joni had hit on an important factor in digital system design: design for people, placing choice within the design remit.
Malcolm singled out Joni’s tweet because of her reference to people in situations of financial stress, or those subject to family violence, and so on. He also pointed to a new law in Australia, the so-called ‘Consumer Data Right’.
Malcolm describes this as being: “supposedly all about the civil libertarian concept of ‘you will have to be told everything and hence you are free to make a decision as an equal party to the bank (or whatever) to negotiate your terms’. Yeah right!”
Malcolm hit the high note. Privacy is a precious commodity, and one that is easily exploited. It is as precious as the data, and the person behind the data, that it sets out to protect. And it is a commodity that cannot be easily encapsulated. The problem, then, is that privacy becomes obfuscated by legalities that even those well-versed in the industry’s mantras struggle to understand.
Malcolm continued to unpick some of the issues around the Australian situation, which, of course, apply worldwide.
He stressed that the freedom to make choices around data sharing rests on some generous assumptions: “[The law] makes assumptions that folks have the time to understand what they are told; folks have the capability to analyse the consequences both short term and long term; folks have the strength (including economic independence) to negotiate the terms of the exchange; folks have alternatives etc.”
Consent, in its purest form, could easily become a dystopian stick with which to control citizens. Imagine if it were a bank, rather than Facebook or Google, that enforced this new ‘right’; or a landlord, an employer, a school or a hospital. Any of them could manipulate the elastic nature of laws built on ‘consent’: consent to give us access to whatever personal and behavioural data we need, or woe betide you. The result of choosing to withhold consent? No bank account, no home, no job, no school? Or reduced services, higher loan fees, poorer healthcare?
The true conflict for privacy — Choice versus consent
So, where does this leave us? Data collection and processing are not getting simpler. Commerce is applying ever newer techniques to data: automated collection, and machine learning and deep learning for analysis. Our digital identity is intrinsically and deeply linked to all of these data, either directly, as attributes defining our digital nature, or indirectly, for example through our interactions with internet-connected devices (IoT). Data privacy is a digital fractal that just keeps getting more complex the deeper in we look.
We need structures that tease out the data we collect, its use, how we share it, and the legal frameworks of the businesses that use it. The GDPR has begun a frantic discourse across the globe on how to achieve Privacy by Design and Default. But it has not resolved the conflict between choice and consent. As individuals, we need structures that allow us to balance the two. This is a design conundrum as much as a legal one.
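As a hedged sketch of what such a structure might look like, the Python below models a register that pairs each collected data item with its purpose, its sharing, and its lawful basis, loosely inspired by the GDPR’s records of processing activities. All of the field names and example entries are assumptions for illustration, not a definitive schema.

```python
from dataclasses import dataclass

@dataclass
class ProcessingRecord:
    """One row in a register of processing activities (illustrative only)."""
    data_item: str       # what is collected
    purpose: str         # why it is used
    shared_with: list    # who else sees it
    lawful_basis: str    # e.g. "consent", "contract", "legitimate interest"
    revocable: bool      # can the user withdraw and keep the service?

# Hypothetical entries, invented for illustration.
register = [
    ProcessingRecord("email address", "account login", [], "contract", False),
    ProcessingRecord("browsing history", "ad targeting",
                     ["ad networks"], "consent", True),
]

# Genuine choice exists only where the basis is consent and
# withdrawal does not cost the user the service itself.
negotiable = [r for r in register if r.lawful_basis == "consent" and r.revocable]
for r in negotiable:
    print(f"{r.data_item}: the user may refuse without losing the service")
```

Laying the register out this way makes the design conundrum visible: every row where choice is not negotiable is a row where someone other than the user has already decided.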
As Malcolm put it: “Almost everything about us is made more trustworthy because OTHERS have made the decisions for us: car safety, plane safety, your doctor, your clothing, your building, on and on and on.
“But personal information is to be treated differently. No, we won’t require organisations to be more responsible and less harmful in their use of information about you. Instead, we will give you ‘control’. Well, just as I would crash the plane if I was the pilot, we are all heading for a massive crash on data.”
Levelling privacy
Software designers must understand the subtle nuances of how privacy can be exploited. This happens at the point where the digital world reaches out and touches the real one. Design decisions have to take legal frameworks, like the GDPR, into account, but they should take people into account too. We are all on a journey through life, and our privacy travels with us. My decisions about who takes a digital piece of me should be mine to make, without coercion. In 1649, the British group of radicals known as The Levellers pushed for civil rights and liberties, publishing a paper to this effect called the ‘Agreement of the People’. It is now time for our own ‘Agreement of the People’, but this time about the liberation of our data privacy. Privacy by Design is about designing for people, no matter what their circumstances, and good digital platforms must be great levellers.
Bio: Susan Morrow is an ex-scientist who moved into cybersecurity, consumer identity, and data privacy. She has worked in the tech sector for 25 years and her interests include how human behaviour informs technology design.