
Hazards of gen AI & fictitious cases

11 June 2025
Issue: 8120 / Categories: Legal News, Profession, Artificial intelligence, Technology
Heads of chambers and law firm partners must take ‘practical and effective measures’ to ensure every individual understands their duties if using artificial intelligence (AI), the High Court has said.

Handing down judgment in R (Ayinde) v London Borough of Haringey [2025] EWHC 1383 (Admin) last week, Dame Victoria Sharp and Mr Justice Johnson held that two lawyers who cited fictitious cases in separate court proceedings should not face contempt proceedings.

However, they emphasised that in future hearings such as these, ‘the profession can expect the court to inquire whether those leadership responsibilities have been fulfilled’.

They set out the range of sanctions for submitting false material: ‘costs order, the imposition of a wasted costs order, striking out a case, referral to a regulator, the initiation of contempt proceedings, and referral to the police’.

In the first case, Sarah Forey, a pupil barrister instructed by Haringey Law Centre, cited fictitious cases during a judicial review. There is no suggestion she intended to use AI or knew the cases were fake. Forey said that, when drafting the grounds, she may have carried out additional Google or Safari searches without realising the results included AI-generated summaries.

Emily Carter and Sahil Kher of Kingsley Napley, acting pro bono for Haringey Law Centre, said their clients ‘fully understand the seriousness of the issues that have arisen, and made full and unconditional apologies to the court.

‘They are reassured that the court has found there was no basis to suggest that the Law Centre or its senior solicitor had deliberately caused false material to be put before the court. The Law Centre paralegal—referred to as a solicitor in the original judgment—was found to be “not at fault in any way”.’

In the second case, Abid Hussain of Primus Solicitors admitted relying on legal research conducted by his own client, Mr Al-Haroun, in an £89.4m claim against Qatar National Bank and another; the research included 18 fake cases. Hussain apologised and referred himself to the regulator.

Ian Jeffery, CEO of the Law Society, said: ‘Whether generative AI, online search or other tools are used, lawyers are ultimately responsible for the legal advice they provide.’

MOVERS & SHAKERS

Birketts—trainee cohort

Firm welcomes new cohort of 29 trainee solicitors for 2025

Keoghs—four appointments

Four partner hires expand legal expertise in Scotland and Northern Ireland

Brabners—Ben Lamb

Real estate team in Yorkshire welcomes new partner

NEWS
Robert Taylor of 360 Law Services warns in this week's NLJ that adoption of artificial intelligence (AI) risks entrenching disadvantage for SME law firms, unless tools are tailored to their needs
From oligarchs to cosmetic clinics, strategic lawsuits against public participation (SLAPPs) target journalists, activists and ordinary citizens with intimidating legal tactics. Writing in NLJ this week, Sadie Whittam of Lancaster University explores the weaponisation of litigation to silence critics
Delays and dysfunction continue to mount in the county court, as revealed in a scathing Justice Committee report and under discussion this week by NLJ columnist Professor Dominic Regan of City Law School. Bulk claims—especially from private parking firms—are overwhelming the system, with 8,000 cases filed weekly
Writing in NLJ this week, Thomas Rothwell and Kavish Shah of Falcon Chambers unpack the surprise inclusion of a ban on upwards-only rent reviews in the English Devolution and Community Empowerment Bill
Charles Pigott of Mills & Reeve charts the turbulent progress of the Employment Rights Bill through the House of Lords, in this week's NLJ