Should you use ChatGPT for legal advice? Rachel Pearce explains why artificial intelligence can’t compete with the real thing.
Artificial Intelligence (AI) is advancing and becoming more accessible all the time. You may have heard of ChatGPT, a natural language processing tool which is powered by AI. This tool took the internet by storm in recent months as people tested its impressive capabilities.
With its ability to save time and fuel creative ideas, the tool caught the attention of professionals across all sectors, including the legal sector, who were keen to see what it could do for them.
However, the dangers and limitations of using something like ChatGPT for legal advice or documents are already becoming apparent.
The benefits of ChatGPT and AI
The spike in popularity of ChatGPT just goes to show how advantageous it can be to use artificial intelligence. Whether you need it to generate blog ideas, interview questions or prompts, it can save time and stress.
When used correctly and to its full potential, the tool's abilities are impressive. It generates content in seconds and remembers a user's previous queries, allowing them to provide follow-up prompts for a better answer. With all of these qualities, it's not hard to understand why people are keen to try it for themselves.
But is the AI, at times, too convincing? It can be hard to distinguish between what ChatGPT knows and what it merely appears to know. One litigant learnt this the hard way in a court case where ChatGPT was used to supply 'supporting evidence'.
A case of false citations
Earlier this year, a case in Manchester saw a Litigant in Person (LiP) presenting a false submission in court based on answers from ChatGPT. The civil case involved one represented party and one unrepresented party. When the first day of proceedings ended, the barrister argued there was no reason for the case to be advanced.
The following day, the LiP returned with four case citations that appeared to back up the points they were trying to make. It didn't take long for the barrister to discover that one of the cases wasn't real. The other three were real cases, but the cited passages didn't match the final judgments in those cases.
This case shows how ChatGPT can generate convincing content with no real or truthful substance behind it. It also highlights the dangers of trusting it for legal advice, or information of any kind.
The risks of using AI for law
The ChatGPT software does not purport to be an expert in law, or any other subject for that matter. It is, at its core, a language processing tool. When asking the language model: ‘What can a solicitor do that ChatGPT can’t?’ you’ll get an answer along the lines of the following:
‘As an AI language model, there are several things that a solicitor can do that I cannot do:
- Provide legal advice
- Represent clients in court
- Draft legal documents
- Handle legal negotiations
- Provide representation and advice on complex legal matters
It’s important to consult with a qualified solicitor for any legal matters, as they can provide tailored advice and support based on their expertise and understanding of the law.’
This disclaimer acts as a warning to consider how you use it and what you use the results for. There are several risks to consider before using artificial intelligence for any legal matters.
As the earlier mentioned court case proved, there is a level of inaccuracy within the ChatGPT software.
OpenAI, the company behind ChatGPT, has also stated that the tool "sometimes writes plausible-sounding but incorrect or nonsensical answers." The information could therefore be partly accurate, or sound accurate, but ultimately be untrue.
The early model of ChatGPT did not have access to real-time information and relied on a dataset from 2021. This meant that the information was outdated and ran the risk of being inaccurate. As of May 2023, the company has announced that ChatGPT will have access to a search engine, so the relevance of its information may improve.
Now, we must consider where on the internet the AI is retrieving its information from, which particular websites are being accessed, and whether these are legitimate sources. Similarly, there is a risk that legal information is not applicable to UK law if it was sourced from a website outside our jurisdiction.
Furthermore, as the model was trained on other people's works, you also have to consider that there could be bias in its responses. AI tools have been found to be biased in the past, for example, racial discrimination in facial recognition technology. This is important to consider when using ChatGPT, and artificial intelligence more widely, in a legal context that must be free of any type of bias.
The ethics of using artificial intelligence within the legal space is itself a subject of debate, and a hot topic of conversation as legal professionals weigh up the advantages and disadvantages. Ironically, two of the biggest ethical concerns surrounding the use of AI are law related.
Copyright law is one of the biggest conversations surrounding ethics and AI. ChatGPT scrapes information and delivers it to the user with no way to verify where that information has come from. ChatGPT itself has even faced legal trouble, as part of a lawsuit involving the unauthorised use of copyrighted material.
Data privacy plays another huge part in the ethics of using AI. As the tool collects private information and sensitive data, how this is stored and shared could have significant implications. This was a particular point of contention in Italy, where the data regulator issued a temporary order demanding that OpenAI stop using the personal information of Italian citizens in its training data.
While recent updates to the model allow users to turn off chat history to disable their chats from training the model, users still need to subscribe to have more control over their data.
Avoiding the dangers
The simplest way to avoid the dangers of using AI for legal advice is to seek out a qualified and reputable law firm or legal professional. There is quite simply no technological substitute for legal advice from a qualified solicitor.
Solicitors and lawyers are experienced, accredited and have the qualifications and expertise to advise on a range of matters and interpret the law for your particular circumstances. As solicitors specialise in particular areas of law, they will also have in-depth knowledge not available elsewhere.
Should you need a legal document drafted, such as a tenancy agreement or a will, it's advisable to do so with the assistance of a solicitor. This gives you peace of mind that everything will go as planned, that the information is complete, up to date and correct, and that further amendments can be made as required.
Furthermore, when discussing the more sensitive areas of the law, such as clinical negligence and personal injury, technology can feel impersonal. Speaking to a solicitor allows you to converse with a friendly face, who can get to know you and your case going forward. Coodes can also refer clients to support agencies and other external providers should more support be required, something that ChatGPT can't do.
Coodes Solicitors have been helping the local community since the firm's inception over 275 years ago. As one of the longest-established firms in the South West, we provide legal advice across the entire spectrum of business and personal needs.
Our services are delivered by highly experienced lawyers who are open, honest, and direct. Don’t take the chance with ChatGPT – get in touch today.
For more information on clinical negligence or personal injury claims, please contact Rachel Pearce from Coodes Solicitors’ Clinical Negligence and Personal Injury Team at 01326 214020 or via email.