
High Court Justice Victoria Sharp must have had a very strange day. While reviewing case citations presented by lawyers, she realized something shocking: many of them simply didn’t exist.
It seems the lawyers had turned to generative AI for research, and, as often happens, the AI hallucinated and fabricated case law. The High Court, which hears high-profile civil cases in England and Wales, warned that citing non-existent cases could lead to contempt of court or even criminal charges. But that’s unlikely to stop AI’s growing presence in legal practice.
Phantom cases
It’s easy to point fingers at students using AI to cheat on homework, but this trend has already seeped into professional legal circles. In a UK tax tribunal in 2023, an appellant submitted nine bogus cases as “precedents”. When the cases were revealed to be fabrications, she admitted it was “possible” she had used ChatGPT. In another 2023 case in New York, a hearing unraveled when a lawyer was challenged to produce the fictitious cases he had cited.
Fast-forward to a recent £90 million ($120 million) lawsuit involving Qatar National Bank: of the 45 case-law citations submitted, 18 turned out to be fake, and several others featured fabricated quotes. The claimant admitted to using publicly available AI tools for legal drafting.
In yet another case, a legal center challenged a London borough over its failure to provide temporary housing. The lawyer cited phantom cases five times and couldn’t explain why no one could find them. While he didn’t admit to using AI, he suggested the citations might have come from Google or Safari searches, both of which now feature AI-generated content.
Needless to say, the judges were not amused.
“There are serious implications for the administration of justice and public confidence in the justice system if artificial intelligence is misused,” Judge Victoria Sharp said in a written ruling.
Sharp acknowledged that AI has real potential to be useful in the courtroom, but emphasized the need for tight oversight.
“Artificial intelligence is a powerful technology. It can be a useful tool in litigation, both civil and criminal,” the ruling reads. “This comes with an important proviso, however. Artificial intelligence is a tool that carries with it risks as well as opportunities. Its use must take place, therefore, with an appropriate degree of oversight, and within a regulatory framework that ensures compliance with well-established professional and ethical standards if public confidence in the administration of justice is to be maintained.”
“In those circumstances, practical and effective measures must now be taken by those within the legal profession with individual leadership responsibilities … and by those with the responsibility for regulating the provision of legal services.”
What happens now?
Sharp issued a “regulatory ruling” regarding the use of AI. While it is not legislation (i.e., a new law passed by Parliament), a regulatory ruling carries authoritative weight within the legal profession and sets out a framework for handling such situations. If lawyers continue to cite hallucinated cases, the consequences could range from public reprimands and costs orders to contempt proceedings and even referral to the police.
“Where those duties are not complied with, the court’s powers include public admonition of the lawyer, the imposition of a costs order, the imposition of a wasted costs order, striking out a case, referral to a regulator, the initiation of contempt proceedings, and referral to the police.”
But as with so many tech-related issues, this is another case of technology moving faster than the rules. AI tools like ChatGPT are already being used to draft legal arguments, yet there’s no global professional standard for using them safely and responsibly.
AI is evolving fast, but the rules, education, and oversight needed to use it responsibly are lagging behind. In these cases the misuse was blatant, but subtler hallucinations may already be slipping through undetected. If accuracy can’t be enforced even in a court of law, that doesn’t bode well for the rest of society.