Contracts to Manage AI Risk (Bermuda)

Published: 22 Jul 2025
Type: Insight

This is the first of a two-part article on how artificial intelligence contracts can be used to manage the development and use risks associated with such transformative technology.

All transformational technology is fraught with risk. However, when it comes to AI, Elon Musk said “we are communing with the devil”.

As a commercial lawyer, I like to think that fables about contracts with the devil exist because contracts are the ultimate risk management tool.

Regulators around the world, including the Bermuda Monetary Authority, fully appreciate the importance of using contracts to mitigate, if not avoid, the risks associated with the development and use of transformative technologies.

The BMA’s 2025 Business Plan signalled future policies concerning the risk-managed use of AI by its registrants.

That plan stated that the BMA was “undertaking a review of the Insurance Code of Conduct and the Operational Cybersecurity Code of Conduct to consider the merits of integrating specific guidelines on the use of AI and machine-learning systems”.

As with the BMA’s regulatory requirements for outsourcing and cybersecurity governance, I fully expect that the evolution of those current regulations will include additional risk management guidance on the AI contracts relied upon by Bermuda’s financial services sector.

In that regard, the recent emergence of model contracts for all sectors to manage the many risks of AI has been striking.

Among the variations of AI model contracts that I have consulted, there are two that stand out.

In 2023, Britain’s Society for Computers and Law published a 59-page White Paper titled Artificial Intelligence Contractual Clauses, and recently the Digital Transformation Agency of the Australian Government published [AI] Model Clauses, Version 2.0. Both are excellent.

Both organisations take a pragmatic approach to crafting contractual provisions that specifically address the commercial and legal risks of AI development, commercialisation and use. There is nothing abstractly academic about that guidance.

The commercial risks associated with all transformative technology, including AI, include the risks that:

  • The technology doesn’t perform the way that the vendor promised it would
  • Due diligence is difficult to undertake on products and vendors that are new to the market
  • The solution’s operation may not be compatible, interoperable or easily integrated with legacy systems
  • The solution’s performance reliability is yet to be proven

To address those “new-to-market” risks, both the SCL and DTA recommend that AI contracts include terms that:

  • Define the operational and functional specifications of the solution in precise and empirically verifiable terms
  • Require either a vendor-led AI demonstration or an operational demonstration within the customer’s infrastructure
  • Require acceptance testing as a precondition to contract effectiveness and any licence fee payments
  • Stipulate a warranty (of reasonable duration) concerning the solution’s “on spec” operation and that requires expedited defect remediation

Where AI is offered as a service rather than as licensed software, the contract should also address the usual risks that are associated with:

  • The different variations of cloud or distributed computing
  • Any jurisdictional export control restrictions
  • Compliance with all privacy laws, including data export restrictions
  • The service provider’s compliance with all applicable law, including outsourcing and cybersecurity regulations
  • Subcontracting restrictions
  • A prohibition on the re-export of data to other jurisdictions

Since many AI solutions are powerful search agents that function as scrapers and “crawler bots”, two of the most prominent and serious AI risks to address contractually are the misappropriation of personal (and often confidential) information that the AI solution accesses, views, copies or uses, and the unlicensed reproduction and misappropriation of third-party intellectual property.

As intelligent as AI may appear, it may be unable to identify data and content that is the property of others.

Based on the AI copyright infringement cases that are now before the courts in the US and Britain, AI contracts should include broadly drafted third-party non-infringement covenants, as well as indemnities to protect users from such third-party liability. That approach to managing the risk of intellectual property infringement is required for all content or data that AI finds, fetches and brings back to the doorstep.

More specifically, the SCL and DTA suggest that AI contracts include covenants to:

  • Ensure that the AI provides only original work
  • Ensure that AI does not merely customise, enhance or create derivative works of someone else’s property
  • Address whether the service vendor owns the AI or the AI otherwise relies on “open source” software
  • Provide that neither the use nor operation of the AI will breach any third-party rights, including any contractual, privacy, intellectual property or statutory rights

Next week, in part two, I will identify additional development and use risks that AI brings, and the contractual terms that are necessary to address those risks.

First Published in The Royal Gazette, Legally Speaking column, July 2025
