Technology and privacy – is the GDPR already out of date?

Published: 31 Oct 2018
Type: Insight

First published by the Cayman Financial Review, October 2018


Nothing challenges the effectiveness of privacy laws like technological innovation. As the volume of data generated about individuals increases, technology is making that data easier than ever to capture and analyse, which in turn makes it ever more valuable.

Unfortunately, technology also introduces new threats. How companies collect, process and protect the personal data of their customers, staff and suppliers has therefore become a key challenge.

The General Data Protection Regulation (GDPR), which came into effect on May 25, 2018, is the European Union’s legislative response to this challenge. Drafted to be “technology neutral,” the GDPR is intended to give individuals better control over their personal data and establish a single set of data protection rules across the EU, thereby making it simpler and cheaper for organisations to do business. So far, so sensible. Unfortunately, technology always runs ahead of the law and the GDPR is already starting to show some of its limitations as the law clashes with newer technologies.

Blockchain technology

Blockchain – or distributed ledger technology – replaces the centralised transaction database with a decentralised, distributed digital ledger in which every transaction is independently verified against copies of the ledger maintained by different parties, in different locations. In this way, the record of any single transaction cannot be altered without changing all subsequent transactions or “blocks” that are chained together across the entire distributed ledger. It is this immutability that ensures the reliability of the information stored on the chain.
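The chaining just described can be sketched in a few lines of Python. This is a toy illustration only, not a real blockchain: the transactions, the block structure and the choice of SHA-256 are all assumptions made for the example.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents, which include the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

# Build a three-block chain; each block commits to its predecessor's hash.
chain = []
prev = "0" * 64  # placeholder hash for the first ("genesis") block
for tx in ["Alice pays Bob 5", "Bob pays Carol 2", "Carol pays Dan 1"]:
    block = {"tx": tx, "prev_hash": prev}
    prev = block_hash(block)
    chain.append((block, prev))

# Tampering with the first transaction changes that block's hash, so the
# "prev_hash" recorded in every later block no longer matches the chain.
chain[0][0]["tx"] = "Alice pays Bob 500"
assert block_hash(chain[0][0]) != chain[1][0]["prev_hash"]
```

Because each block's hash depends on the previous block's hash, rewriting any one entry invalidates every block that follows it, which is the immutability the article describes.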

The GDPR gives data subjects the right to request that their personal data is either rectified or deleted altogether. For blockchain projects that involve the storage of personal data, these legal rights do not mix well with the new technology. Drafted on the assumption that there will always be centralised services controlling access rights to the user’s data, the GDPR fails to take into account how a permissionless blockchain works. Ultimately, this may mean that blockchain technology cannot be used for the processing of personal data without potentially falling foul of the GDPR.

Interestingly, blockchain technology provides its own potential solution to this problem by allowing personal data to be kept off the various ledgers altogether. It does this by replacing the personal data with a cryptographic fingerprint of it – a “hash.” These hashes prove that the data exists, but without the data itself appearing on the chain.
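As a rough sketch of that design (the record, the salt value and the function names are invented for illustration), only the hash is written to the chain while the data itself is held elsewhere:

```python
import hashlib

def fingerprint(personal_data: str, salt: str) -> str:
    """Return a SHA-256 digest; only this 64-character hex string goes on-chain."""
    return hashlib.sha256((salt + personal_data).encode()).hexdigest()

# Hypothetical off-chain record and per-record salt.
record = "Bob Smith, 1 High Street, George Town"
on_chain = fingerprint(record, salt="per-record-salt")

# The record itself never touches the ledger. Anyone who holds the record
# and the salt can recompute the fingerprint and prove it matches the
# chain, but the fingerprint alone does not disclose the record.
assert fingerprint(record, salt="per-record-salt") == on_chain
assert len(on_chain) == 64
```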

Problem solved? Unfortunately not. The GDPR draws an unhelpful distinction between pseudonymised and anonymised data. Pseudonymisation occurs where personal data is subjected to technological measures (like hashing or encryption) so that it no longer directly identifies an individual without the use of additional information. Anonymisation, on the other hand, results from processing personal data in order to irreversibly prevent identification. As such, anonymised personal data falls outside the scope of the GDPR, whereas pseudonymised data – including hashed data – does not.
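The reason hashed data counts as pseudonymised rather than anonymised can be seen in a short sketch (the names are invented for illustration): because identities come from small, guessable spaces, anyone can re-hash candidate values and compare them against the stored hash.

```python
import hashlib

def h(value: str) -> str:
    """Unsalted SHA-256 of a string, as a hex digest."""
    return hashlib.sha256(value.encode()).hexdigest()

# A "pseudonymised" record: the name has been replaced with its hash.
stored = h("Bob Smith")

# Names, email addresses and ID numbers all come from enumerable spaces,
# so re-hashing each guess can re-identify the individual. The hash is
# therefore reversible "with the use of additional information" and the
# data remains personal data under the GDPR.
candidates = ["Alice Jones", "Bob Smith", "Carol White"]
matches = [c for c in candidates if h(c) == stored]
assert matches == ["Bob Smith"]
```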

Unlawful algorithms

Social media sites and search engines specialise in algorithms that allow them to target advertisements at users. However, the way those algorithms work makes all the difference and reveals an unintended consequence of the GDPR’s drafting.

Take the example of Bob. Bob decides to buy a new car and does all of his research using an internet search engine. He then posts details of his new purchase on social media. The algorithms for Bob’s social media site correctly profile Bob as someone who is likely to buy car products or access car-related services in the future. Bob will therefore start to see targeted adverts on his social media page. Following his hours of online research for a new car, the algorithms used by his chosen search engine reach the same conclusion, and Bob will also start to see some of those same adverts each time he goes online. While the resulting adverts Bob receives may be the same, the way the algorithms achieve this result is very different.

Social media algorithms target adverts by knowing who you are, whereas search engines target adverts by knowing what you are searching for. The who versus what dichotomy is therefore critical under the GDPR. Social media sites know which adverts to show Bob because they analyse his profile and hold personal data about him. The algorithms for most search engines, on the other hand, look only at what Bob searched for. All those engines need in order to target their advertising is to know that somebody in a particular geographic area used the search term “new car.” The engines have no idea that it was Bob searching for a new car, just that someone did. Search engines can therefore ignore personal data and still achieve the same algorithmic precision; social media sites cannot.
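The who-versus-what distinction can be made concrete with a toy sketch (the profiles, regions and function names are all invented for illustration): profile-based targeting needs a key that identifies the user, while query-based targeting needs only the search term and a coarse region.

```python
# Profile-based targeting (social media): requires knowing WHO the user
# is - the lookup key is an identifiable individual's profile.
profiles = {"bob": {"interests": ["cars"], "location": "George Town"}}

def social_ads(user_id: str) -> list[str]:
    return [f"{i} advert" for i in profiles[user_id]["interests"]]

# Query-based targeting (search engine): needs only WHAT was searched,
# plus a coarse region - no link to any identifiable individual.
def search_ads(query: str, region: str) -> list[str]:
    return [f"{query} advert ({region})"] if "car" in query else []

assert social_ads("bob") == ["cars advert"]
assert search_ads("new car", "George Town") == ["new car advert (George Town)"]
```

Both paths can surface the same advert, but only the first one processes personal data in the GDPR sense.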

Should Bob be required to give his consent before his data is used in this way? Under the GDPR, arguably yes, but only for the way the social media site uses his data. Bob has no ability to stop his chosen search engine using the data it holds, because that data is not considered “personal data.”

Artificial intelligence

Artificial intelligence relies on machine learning, but for machines to learn, they need to crunch data, and lots of it. The GDPR makes it more difficult for those machines to get the data in the first place and once they have the data, rights granted to data subjects under the GDPR could also make it difficult for companies to reap the full benefits of machine learning.

The volume of data available for machine learning is not a problem, but under the GDPR, using that data lawfully often will be. This is because those developing machine learning will often be data processors rather than data controllers. Data processors are not permitted to decide for themselves how personal data is used; they can only use the data as directed by the data controller and with the consent of the data subject.

Assuming consent is obtained and the machines learn from the data they consume, the output those machines then generate may also be restricted by the GDPR. This is because data subjects have a right under the GDPR not to be subject to a decision based solely on automated processing if that decision significantly affects them. In other words, much of the freedom to let machines make automated decisions will be linked to how those decisions affect our lives. Automated decisions about our shopping habits will probably be fine, but automated decisions that determine a career promotion or a mortgage application are likely to be challenged in the future.

Conclusion

With the GDPR now in force, the long arm of EU data protection law is not only reaching beyond the EU’s borders; it is also potentially shaping our use of new technologies.

Technology will not pause while the law adjusts, which means legal frameworks like the GDPR must remain flexible enough to strike a balance between technological progress and the protection of individual privacy.
