Ekaterina Khrustaleva, chief operating officer of ImmuniWeb, explores the rise in data privacy legislation and why large companies are still falling foul of those laws
In the near future, Google Analytics could become unusable across the European Union (EU), as European Data Protection Authorities (DPAs) support the ruling of the French data protection watchdog, the Commission nationale de l’informatique et des libertés (CNIL). The watchdog found that an unnamed local website’s use of Google Analytics was non-compliant with the EU’s main privacy law, the General Data Protection Regulation (GDPR).
The CNIL found the website in violation of Article 44, which governs transfers of personal information outside the EU to countries that lack essentially equivalent privacy protections. Those countries include the United States, whose far-reaching surveillance laws give non-US citizens neither notice of, nor meaningful control over, the collection and use of their data.
Google Analytics is a popular cloud tool that helps website operators evaluate user engagement. Google states that all information collected by the service is anonymised, but the CNIL is not satisfied with this claim. In the French watchdog’s view, the unique identifier the service assigns to each user, together with the data collected in association with it, would allow the tech giant to identify people personally.
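To illustrate how persistent that identifier is, the sketch below extracts the client ID that Universal Analytics keeps in the first-party _ga cookie. The cookie name and its GA1.2.<random>.<first-visit timestamp> layout are documented Google Analytics behaviour; the helper function itself is hypothetical.

    // Hypothetical helper: pull the Google Analytics client ID out of the
    // first-party "_ga" cookie. Universal Analytics stores it as
    // "GA1.2.<random number>.<first-visit Unix timestamp>"; the captured
    // last two fields identify the same browser across every visit.
    function getGaClientId(cookies: string): string | null {
      const match = cookies.match(/(?:^|;\s*)_ga=GA\d+\.\d+\.(\d+\.\d+)/);
      return match ? match[1] : null;
    }

    // In a browser this would be called as getGaClientId(document.cookie).
    console.log(getGaClientId('_ga=GA1.2.123456789.1612345678')); // "123456789.1612345678"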
While the order to stop using Google Analytics applies only to one French website, the support of other European DPAs for the ruling is what could make the service unusable across the EU. The only way for Google to avoid this is to change how the service operates: the CNIL ruling still permits its use, but only if the transferred data is fully anonymised and purely statistical.
In January 2022, the Austrian DPA likewise determined that the data collected by Google Analytics is enough to potentially identify individuals, finding that the service’s collection of IP addresses and cookie identifiers violates the GDPR. Google Analytics does offer an IP anonymisation option, but website operators must enable it themselves, and even a site that tried to enable anonymisation but configured it incorrectly would still be in violation. That, combined with the French ruling, does not bode well for US cloud services in the EU.
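For Universal Analytics, the option in question is the documented anonymize_ip setting of gtag.js, which truncates the last octet of IPv4 addresses (and the last 80 bits of IPv6 addresses) before the full address is stored. A minimal sketch, with a placeholder measurement ID:

    // gtag.js is loaded globally by the standard Google Analytics snippet.
    declare function gtag(...args: unknown[]): void;

    // "UA-XXXXXXX-Y" is a placeholder. With anonymize_ip enabled, Google
    // zeroes the last octet of IPv4 addresses (the last 80 bits of IPv6)
    // as early as technically feasible, before the data is stored.
    gtag('config', 'UA-XXXXXXX-Y', { anonymize_ip: true });

As both rulings stress, the burden of enabling and correctly configuring this setting sits with the website operator, not with Google.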
Violations of certain provisions of the GDPR can cost companies up to €20 million or 4% of their total worldwide annual turnover for the preceding financial year, whichever is higher. Fines for data security and protection violations may reach €10 million or 2% of that turnover. In some notorious cases the fine was eventually reduced from hundreds of millions to a significantly smaller amount, though for reasons unrelated to the gravity of the violation. Various reports show that there is little consistency in GDPR fines and enforcement priorities among European DPAs; the European Data Protection Board could bring more clarity and uniformity by issuing additional guidelines on fines.
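To make the arithmetic concrete, the sketch below computes the cap for the higher tier (Article 83(5) of the GDPR); the turnover figure is invented for illustration.

    // Upper bound of a fine under Article 83(5) GDPR: EUR 20 million or 4%
    // of total worldwide annual turnover of the preceding financial year,
    // whichever is higher.
    function maxArticle83Fine(annualTurnoverEur: number): number {
      return Math.max(20_000_000, 0.04 * annualTurnoverEur);
    }

    // A hypothetical company with EUR 2 billion in turnover: 4% is EUR 80
    // million, above the EUR 20 million floor, so that becomes the ceiling.
    console.log(maxArticle83Fine(2_000_000_000)); // 80000000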
As of January 2022, over 900 GDPR fines had been issued across the EU and the UK. The biggest was issued to Amazon in 2021: €746 million. The next is WhatsApp, with a €225 million penalty for transparency failings, which forced the company to rewrite its privacy policy for European users. In January 2022, the CNIL hit Google with a €150 million fine related to the implementation of cookie consent procedures on YouTube. At the same time, the CNIL also fined Facebook €60 million because users in France could not refuse cookies as easily as they could accept them.
Additionally, individuals whose privacy rights under the GDPR have been violated may file a civil lawsuit in their country of residence or employment and claim financial compensation for material or non-material damage.
Under the GDPR, personal data, often referred to as personally identifiable information (PII), must be “processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (‘integrity and confidentiality’)”.
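Encryption of personal data at rest is one widely accepted example of such a “technical measure”. The sketch below, a minimal illustration rather than a production design (key management is deliberately omitted), uses Node.js’s built-in crypto module to protect a single PII field with AES-256-GCM.

    import { createCipheriv, createDecipheriv, randomBytes } from 'node:crypto';

    // Illustrative only: in practice the key would come from a KMS or HSM,
    // not be generated on the fly.
    const key = randomBytes(32);

    interface SealedField { iv: string; tag: string; data: string }

    function encryptPii(plaintext: string): SealedField {
      const iv = randomBytes(12); // standard 96-bit GCM nonce
      const cipher = createCipheriv('aes-256-gcm', key, iv);
      const data = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
      return {
        iv: iv.toString('base64'),
        tag: cipher.getAuthTag().toString('base64'), // tamper-evidence
        data: data.toString('base64'),
      };
    }

    function decryptPii(box: SealedField): string {
      const decipher = createDecipheriv('aes-256-gcm', key, Buffer.from(box.iv, 'base64'));
      decipher.setAuthTag(Buffer.from(box.tag, 'base64'));
      return Buffer.concat([
        decipher.update(Buffer.from(box.data, 'base64')),
        decipher.final(),
      ]).toString('utf8');
    }

    console.log(decryptPii(encryptPii('jane.doe@example.com'))); // round-trips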
It is not only the EU that has strict data privacy legislation, though. For instance, the California Privacy Rights Act (CPRA), approved in November 2020 and taking full effect on 1 January 2023, prohibits so-called dark patterns: various “tricks” in the user interface that hinder or prevent users from giving voluntary consent.
Moreover, privacy legislation is becoming stricter all the time, imposing additional data protection requirements and expanding the definition of what counts as personal data. Under the CPRA, a consent interface that lets users accept all cookies in one second but takes 20 seconds to reject them would itself be an unlawful dark pattern: refusal must be as easy as consent. Similar privacy requirements will likely become law in many other countries soon, so large companies, such as Apple, that have already implemented them in their software set a laudable example for other vendors.
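A compliant interface in that spirit simply gives acceptance and refusal equal weight. The sketch below, assuming a browser environment and entirely invented markup, renders both choices as single-click buttons of equal prominence.

    // Hypothetical consent banner: accepting and rejecting are symmetric,
    // single-click actions, rather than "reject" being buried behind
    // several settings screens (a classic dark pattern).
    function renderConsentBanner(onChoice: (consented: boolean) => void): void {
      const banner = document.createElement('div');
      for (const [label, consented] of [['Accept all', true], ['Reject all', false]] as const) {
        const button = document.createElement('button');
        button.textContent = label;
        button.addEventListener('click', () => {
          onChoice(consented); // one click either way
          banner.remove();
        });
        banner.appendChild(button);
      }
      document.body.appendChild(banner);
    }

    renderConsentBanner((consented) => console.log('analytics allowed:', consented));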