Regulators have for some years favoured a risk-based approach to regulation. In a world where economic, cultural and political agendas vary enormously, it makes sense to utilise such a system. Regulation which enables sufficient flexibility to accommodate diversity of approach, but incorporates commonality of purpose, is more likely to lead to widespread adoption and thus achieve its goals.

Flexibility does, however, bring uncertainty. How do you know whether your assessment of the risks of a transaction will be viewed in the same light some months (or years) later by a regulator? Hindsight and the application of modern standards to historic actions are both tricky rocks to circumnavigate during an investigation; the contemporaneous pressures are often long forgotten. Documenting decisions and evidencing thought processes are both essential steps in justifying your position.

Now add an extra dimension to the process – ethical implications. From ESG, impact investing and Guernsey’s Green Fund to cardboard straws and exporting our waste, corporate practices are undergoing a radical (and necessary) change. Some areas of regulation have undergone or are facing a similarly radical overhaul, leading to calls for ethical factors to be formally introduced as part of the decision-making process. Technological progress and the proliferation of data are two drivers of this call for change.

It is becoming less acceptable to do something simply because the law does not expressly forbid it. The question “Can I do this?” should be immediately followed by “Should I do this?”. There may be no criminal or civil penalty attaching to a morally dubious practice, but once news circulates, customers tend to vote with their feet (or mice). The Cambridge Analytica scandal showed us that there does not have to be a finding of breach of contract or criminality before a business is terminally damaged by public outrage.

Making ethical considerations central to development in the technology space, and a core consideration before processing our data, is fine on one level. But who decides what those ethical considerations should look like and polices their application? The UK Government recently issued a public statement calling for tech giants to play a greater part in removing “offensive” material from their sites. Do we really want Google/Instagram/Facebook employees imposing their versions of morality on us?

The problem with ethical guidelines is that our views vary enormously. Technology solutions face the same problem. An algorithm can detect and remove material from a website much more quickly and consistently than a human operator, but we have already seen in several cases from the USA that algorithms are subject to bias and flaws, just as we are. They are, after all, the mechanical personification of their programmers. How is the application of technology to be monitored once it is installed, other than through the courts?

John Hancock, the US insurance business, recently confirmed that it would only be offering its policies to customers who committed to being monitored by a wearable device (such as a Fitbit). Encouraging exercise is the stated purpose, but there are also somewhat sinister implications – what happens if the data it reveals leads to an increase in your premium, based on a judgment of your eating habits? Will your health data subsequently be used to deny you (or your children) cover?

The 40th International Conference of Data Protection and Privacy Commissioners takes place shortly and has a heavy focus on ethics and the future. It may be that the most appropriate route to incorporating ethical considerations into regulation is through the regulation of data. After all, data is the thread which runs through all of our daily lives; regulation of data sits as an “umbrella” across all industries.

Even though it is difficult to easily achieve commonality in the area of ethics, we should all be prepared to engage and help set the boundaries. If there are none, our cultural and moral fabric is threatened. That is one area that cannot be replaced by an algorithm.
