Regulators have for some years favoured a risk-based approach to regulation. In a world where economic, cultural and political agendas vary enormously, such a system makes sense. Regulation which allows sufficient flexibility to accommodate diversity of approach, while preserving commonality of purpose, is more likely to be widely adopted and thus to achieve its goals.

Flexibility does, however, bring uncertainty. How do you know whether your assessment of the risks of a transaction will be viewed in the same light months (or years) later by a regulator? Hindsight and the application of modern standards to historic actions are both tricky rocks to steer around during an investigation; the contemporaneous pressures are often long forgotten. Documenting decisions and evidencing thought processes are essential steps in justifying your position.

Now add an extra dimension to the process – ethical implications. From ESG, impact investing and Guernsey’s Green Fund to cardboard straws and exporting our waste, corporate practices are undergoing a radical (and necessary) change. Some areas of regulation have undergone or are facing a similarly radical overhaul, leading to calls for ethical factors to be formally introduced as part of the decision-making process. Technological progress and the proliferation of data are two drivers of this call for change.

It is becoming less acceptable simply to do something because the law does not expressly forbid it. The question “Can I do this?” should be immediately followed by “Should I do this?”. There may be no criminal or civil penalty attaching to a morally dubious practice, but once news circulates, customers tend to vote with their feet (or mice). The Cambridge Analytica scandal showed us that there does not have to be a finding of any breach of contract or criminality before a business is terminally damaged by public outrage.

Making ethics central to development in the technology space, and a core consideration before our data is processed, is fine on one level. But who decides what those ethical considerations should look like, and who polices their application? The UK Government recently issued a public statement calling for tech giants to play a greater part in removing “offensive” material from their sites. Do we really want Google/Instagram/Facebook employees imposing their versions of morality on us?

The problem with ethical guidelines is that our views vary enormously. Technology solutions face the same problem. An algorithm can detect and remove material from a website far more quickly and consistently than a human operator, but several cases in the USA have already shown that algorithms are subject to bias and flaws, just as we are. They are, after all, a reflection of the people who programmed them. And how is the application of technology to be monitored once it is deployed, other than through the courts?

John Hancock, the US insurance business, recently confirmed that it will only offer policies to customers who commit to being monitored by a wearable device (such as a Fitbit). Encouraging exercise is the stated purpose, but there are also somewhat sinister implications: what happens if the data leads to an increase in your premium, based on a judgment of your eating habits? Will your health data be used to deny you (or your children) cover in the future?

The 40th International Conference of Data Protection and Privacy Commissioners takes place shortly and has a heavy focus on ethics and the future. It may be that the most appropriate route to incorporating ethical considerations into regulation is through the regulation of data. After all, data is the thread which runs through all of our daily lives; regulation of data sits as an “umbrella” across all industries.

Even though commonality is difficult to achieve in the area of ethics, we should all be prepared to engage and help set the boundaries. Without them, our cultural and moral fabric is threatened. That is one thing that cannot be replaced by an algorithm.
