AI is no silver bullet for contract drafting

Bill Barton is a director at Barton Legal

There has been no shortage of excitement about the potential applications of artificial intelligence (AI) across the legal sector. Some have even foreseen a future where AI sits as a judge, at least in the more straightforward cases. Already, a number of law firms are using AI for some day-to-day work. The practice is now sufficiently widespread that the Solicitors Regulation Authority felt compelled to release a report in November last year looking at the potential benefits and risks of law firms using AI.

“Any legal document stands or falls on the basis of precision – which was sadly lacking from the AI’s efforts”

One of the areas where AI is expected to have most application is in drafting the contracts that underpin virtually every legal agreement in the UK.

The potential appeal of AI for a lawyer working in an industry as reliant on contracts as construction speaks for itself. But as we discovered when attempting to use AI to draft contracts ourselves, the process is far from straightforward. Not only did the AI fail to increase efficiency – it required about as much direction as a very junior trainee – but the drafting process also raised a number of other concerns.

Our AI contracts

To gain a better sense of what the technology can be used for when drafting a contract, we wanted to assess what AI was capable of producing at the first time of asking.

We began with a direction as simple as “draft a construction contract”, but quickly realised we would need to make more nuanced requests. With this in mind, we asked it to “draft a lease” and “draft a construction contract for groundworks”.

The AI was rather more successful in drafting a lease. This was no surprise, as there is more information about leases in the public domain. Even so, a simple Google search would turn up a more complete draft lease.

Our intention, honed through trial and error, was to work out what we needed to ask the AI to refine the approach. So we asked it to redraft the groundworks contract in a way that favoured the client.

Eventually, satisfied by our progress at this stage, we then asked it to produce a full contract, using the prompt: “Draft a full-length groundworks construction contract which favours the employer dated 19/03/2024.”

What the AI provided was ultimately very superficial, but as a training exercise it did give us enough to understand where AI was struggling and, to an extent, how we could most effectively direct it to improve its performance. Then, using what we had learned, we concluded our experiment with AI by asking it to produce a construction-specific contract clause, requesting the AI to “add in a provisional sum to the above contract”.

Where the AI went wrong

One of the immediate issues with the AI-drafted contracts was as simple as the language they were written in. Although they were all produced in English, the tone, spelling and language were very American – which is far from ideal for a practitioner of English law. Indeed, while it may seem like a relatively minor mistake, any legal document stands or falls on the basis of precision – which was sadly lacking from the AI’s efforts.

That speaks to a wider concern with using AI to draft contracts. When drafting a contract, the starting point is typically a base document that we have written, or we may draft more complex contracts entirely from scratch. We’re able to rely on these documents because we can trust in our own drafting. With an AI contract, however, that trust just cannot be assumed.

This means that instead of offering a more efficient alternative, it’s necessary to check and consider every line carefully and in detail to ensure that the AI has produced what you asked it to. That process is not as simple as asking an AI to draft a contract for ‘x’ purpose. Instead, a series of prompts need to be tailored to train the AI.

This may in the long run result in a more efficient drafting process as the AI learns from past errors, but without knowing how long this training period would be, it’s hard to argue for its efficiency with any conviction.

Broader concerns

Contracts may have a ‘standard’ form in many instances, but the division of responsibility between parties and their liability for anything that may go wrong must still always be carefully considered. Otherwise, the result won’t be a fair contract – an issue that has potentially serious legal consequences in the event of a dispute.

As a result, even if AI can produce a ‘word-perfect’ contract, if it doesn’t properly allocate risk and liability, the clauses may as well be riddled with spelling errors.

What the exercise shows is that AI can produce something that, to a layperson, may look legal, impressive, or both. However, what is most striking to a solicitor is not what is there, but how much is missing.

There are also very real concerns that, as the use of AI becomes more widespread, the understanding of what a contract actually entails will decrease.

As demonstrated by the Triple Point case, too many contracts are already signed without being properly understood by both parties.

A good lawyer, when either drafting or reviewing a contract, should be able to explain to their client what the contract actually entails in layperson’s terms. But a contract produced by AI, with only minimal supervision by a lawyer – if that – significantly increases the risks of being agreed to without first being fully understood.

AI is going to be used with increasing regularity, across both the legal sector and the wider world of business, in a host of different ways. That may well, eventually, include drafting contracts.

However, what our experiment has shown is that we are still some way off being able to use AI to generate a contract that can actually be relied on by clients.

And a contract that cannot be relied on is not worth the paper it’s written on.