Fake it ‘til you… get referred to the SRA
On 6 June 2025, the President of the King's Bench Division of the High Court of Justice, Dame Victoria Sharp, delivered a ruling alongside Mr Justice Jeremy Johnson in the Divisional Court following the referral of two cases where lawyers were found to have submitted court documents riddled with fake legal authorities generated or likely to have been generated by AI tools such as ChatGPT. The ruling scrutinised the role of the lawyers in the cases of R (on the application of Ayinde) v The London Borough of Haringey (AC-2024-LON-003062) (“Ayinde”) and Al-Haroun v Qatar National Bank and Anor (CL-2024-000435) (“Al-Haroun”), and issued a stern warning to lawyers about the limits of AI, the weight of professional duties and the serious consequences of neglecting them.
Ayinde
Mr Ayinde brought judicial review proceedings against the London Borough of Haringey (the defendant) in respect of its failure to provide interim accommodation pending a statutory review of a decision that he did not have a priority need for housing. Haringey Law Centre acted for Mr Ayinde. The grounds for judicial review were settled by a junior barrister.
The submission misstated the effect of section 188(3) of the Housing Act 1996 and cited five fictitious authorities. The solicitor for the defendant raised concerns about the misstatement of the statute and the citations in pre-hearing correspondence, but the claimant’s lawyers dismissed the concerns, describing them as “cosmetic errors”. The defendant proceeded to make a wasted costs application against the claimant, which was heard in April 2025. The judge, Mr Justice Ritchie, found that the behaviour of the junior barrister and the Haringey Law Centre had been “improper and unreasonable and negligent”. The barrister denied using generative AI tools to prepare the submission.
The cases of Ayinde and Al-Haroun (see below) were referred to the Divisional Court under the court’s inherent jurisdiction to regulate its own procedures and to enforce the duties that lawyers owe to the court. Dame Victoria Sharp rejected the barrister’s account of events, concluding that there were two possible scenarios, namely that: (1) the barrister deliberately included the fake citations (which would be a clear contempt of court); or (2) she did use generative AI tools to produce her submission (in which case her denial in a witness statement supported by a statement of truth would be untruthful and would amount to a contempt).
The Divisional Court ruling stopped short of initiating contempt proceedings for various reasons, including that the barrister is “an extremely junior lawyer”, that she had already been criticised in a public judgment, and that she had already been referred to the Bar Standards Board (“BSB”), which will conduct an investigation into the matter. However, Dame Victoria Sharp made it clear that the decision not to initiate contempt proceedings is not a precedent and that “[l]awyers who do not comply with their professional obligations in this respect risk severe sanction”. She also referred the barrister to the BSB.
With regard to the instructing solicitors, the court found that the paralegal involved had acted appropriately throughout and was not at fault, and that there was no reason to suspect that the solicitor had deliberately caused false material to be put before the court. However, as the defendant had put the solicitor on notice of what had happened, Dame Victoria Sharp considered that the steps he took in response were inadequate and referred him to the Solicitors Regulation Authority (“SRA”).
Al-Haroun
Mr Al-Haroun sought damages for alleged breaches of a financing agreement by the defendants, Qatar National Bank and QNB Capital. He was represented by Primus Solicitors.
In April 2025, Mrs Justice Dias extended the time for the defendants to file and serve evidence in relation to applications to dispute the court’s jurisdiction and to strike out the claim or to enter summary judgment. Mr Al-Haroun applied to set aside that order, in support of which he provided a witness statement and relied on a witness statement from his solicitor. In May 2025, Mrs Justice Dias dismissed the application on the basis that the witness statements contained “numerous authorities, many of which appear to be either completely fictitious or which, if they exist at all, do not contain the passages supposedly quoted from them, or do not support the propositions for which they are cited”. One of the bogus authorities cited was even attributed to Mrs Justice Dias herself.
Mr Al-Haroun accepted responsibility for the inclusion of the fake authorities, admitting that they had been generated using publicly available AI tools, legal search engines and online sources. His solicitor accepted that he had relied on Mr Al-Haroun’s legal research without independently verifying the authorities. He admitted that what he had done was wrong and referred himself to the SRA.
On referral to the Divisional Court, Dame Victoria Sharp found of the solicitor, Mr Hussain, that there had been “a lamentable failure to comply with the basic requirement to check the accuracy of material that is put before the court” and that “[i]t is the lawyer’s professional responsibility to ensure the accuracy of such material”. Dame Victoria Sharp concluded that the threshold for contempt had not been met but, as with Ayinde, referred the solicitor to the SRA.
Use of AI in court proceedings
Dame Victoria Sharp acknowledged that AI is a powerful tool which can be useful in litigation and which carries both opportunities, for example to assist in the management of large disclosure exercises, and risks. Publicly available generative AI tools trained on large language models, such as ChatGPT, are “not capable of conducting reliable legal research”; such tools can produce “apparently coherent and plausible responses to prompts, but those coherent and plausible responses may turn out to be entirely incorrect”.
The ruling reinforced that lawyers who use AI tools to conduct legal research have a professional duty to check the accuracy of such research before using it in the course of their professional work. Further, there are serious implications for the administration of justice if AI is misused. If public confidence in the legal system is to be maintained, the use of AI tools must therefore be accompanied by “an appropriate degree of oversight, and within a regulatory framework that ensures compliance with well-established professional and ethical standards”.
Lawyers' duties
Reiterating the professional standards expected of solicitors and barristers alike, the court made clear that all legal representatives are responsible for the material they put before the court. This duty applies irrespective of how that material is generated. The court highlighted the need for practical and effective measures to be taken by those with leadership responsibilities to ensure that all lawyers understand and comply with their professional and ethical obligations and their duties to the court when using AI.
Lawyers are under a strict obligation not to mislead the court, either knowingly or recklessly. The submission of documents citing fictitious authorities, even inadvertently, may constitute professional misconduct if it results from a failure to meet the required standard of diligence. The duty to act competently includes verifying that cited authorities exist, are correctly quoted and support the propositions advanced.
Consequences of falling short
The consequences for the lawyers in Ayinde and Al-Haroun were significant. In Ayinde, the solicitor and barrister were each required to pay the defendant council £2,000 in wasted costs, and the barrister was referred to the BSB. In Al-Haroun, the solicitor was referred to the SRA. The court also left no doubt as to the seriousness of their failure to meet their professional obligations.
While the court stopped short of initiating contempt proceedings in these cases, the court made clear that, where lawyers do not comply with their duties, the court’s powers include: “public admonition of the lawyer, the imposition of a costs order, the imposition of a wasted costs order, striking out a case, referral to a regulator, the initiation of contempt proceedings, and referral to the police”.
Comment
This ruling is a stark warning to lawyers. While generative AI tools promise efficiency and innovation, they cannot replace the rigour, integrity and diligence expected of legal professionals. Lawyers must treat AI not as a shortcut but as a tool which demands careful human oversight. As the court made clear, the legal system depends on the reliability of those who appear before it: if AI is to have a future in legal practice, it must be anchored in accountability. More information on the risks and opportunities of AI can be found in our Lexis Nexis article on the topic [1].
The ruling is R (on the application of Ayinde) v The London Borough of Haringey and Al-Haroun v Qatar National Bank [2025] EWHC 1383 (Admin).
Footnotes
[1] Lexis Nexis, AI in dispute resolution—balancing the risks and opportunities, 26 February 2025: https://www.lexisnexis.co.uk/legal/news/ai-in-dispute-resolution-balancing-the-risks-opportunities (paywall).