AI in the Courtroom: Should Lawyers Have to Declare Its Use?

23 February 2026

The Civil Justice Council (CJC) — the advisory public body established under the Civil Procedure Act 1997 to oversee and coordinate the modernisation of the civil justice system in England and Wales — has published an important interim consultation paper examining whether new rules are needed to govern the use of AI by legal representatives in preparing court documents.

Why is this an Interim Report?

This report represents the CJC’s initial thinking rather than its final conclusions. The process has two stages: this interim report sets out preliminary proposals and asks specific consultation questions, inviting responses from legal professionals, judges, academics and others with an interest in civil justice. A final report will follow once the consultation period closes, incorporating the responses received before making definitive recommendations.

Why has the CJC issued this guidance?

AI is already transforming legal practice. The benefits are real and significant — it improves efficiency, reduces costs, enhances access to justice, and has already revolutionised the way lawyers approach research, data analysis and document preparation. The CJC is clear that it wants these benefits to continue and has no desire to restrict the use of AI in legal practice.

However, the risks are equally real. The CJC identifies several specific concerns that made this consultation necessary:

  • Hallucinations — AI systems can fabricate case citations, legislation and legal references that look genuine but do not exist. Presenting such material to a court would breach a lawyer’s professional duties and undermine the administration of justice.
  • Bias — AI tools based on Large Language Models inevitably reflect errors and biases in their training data, which lawyers may unwittingly allow to creep into court documents.
  • Deepfakes and manipulation — The increasing ability of AI to manipulate metadata and generate realistic but false text, images and videos is already posing challenges for the civil courts, particularly around forged evidence.
  • Unrepresented litigants — Litigants in Person are already using AI to prepare court documents, often without the skills to verify its accuracy or even the awareness that it can be wrong.
  • Witness evidence integrity — There is a real risk that AI could be used to alter, embellish or rephrase a witness’s evidence, undermining the fundamental principle that witness statements must be in the witness’s own words.

The key proposals are:

  • Statements of Case & Skeleton Arguments — No additional rules needed, provided the document bears the name of the legal representative taking professional responsibility for it. Accountability rests with the named professional, not the tool they used.
  • Witness Statements — A more restrictive approach is proposed. For trial witness statements (whether under PD57AC or CPR Part 32), legal representatives should be required to declare that AI has not been used to generate, alter, embellish, strengthen, dilute or rephrase the witness’s evidence. Witness statements must remain in the witness’s own words.
  • Expert Reports — Experts should be required to identify and explain any AI used in preparing their reports (beyond administrative uses such as transcription), including identifying the specific tools used.
  • Disclosure — No new rules proposed for disclosure lists at this stage, given parties are currently cooperating well in this area.

What the CJC is NOT proposing to regulate — Administrative uses of AI such as spell-checking, grammar, transcription, formatting and accessibility software require no declaration.

The overall message is pragmatic and balanced: AI is here, its use is growing, and it brings genuine benefits to the justice system. But accountability must remain firmly with the human legal professional, and public confidence in the courts must be protected.

What Happens Next?

The CJC is now inviting responses to its consultation questions from all interested parties — including legal professionals, judges, academics, litigants and legal tech specialists. Once the consultation period closes, the CJC will analyse the responses received and produce a final report setting out its definitive recommendations. Those recommendations could lead to changes to the Civil Procedure Rules and Practice Directions governing how court documents are prepared and verified. The CJC has also signalled that further work may follow on related areas not covered by this consultation, including the use of AI by unrepresented litigants and in wider areas of the justice system beyond civil courts. This is therefore just the beginning of what is likely to be an ongoing and evolving conversation about the role of AI in the administration of justice.

Read the full report here: https://www.judiciary.uk/wp-content/uploads/2026/02/Interim-Report-and-Consultation-Use-of-AI-for-Preparing-Court-Documents-2.pdf

Robert Foote
Partner - Corporate and Commercial Disputes & Restructuring and Insolvency
Robert Foote is a Partner Barrister at Spencer West. He specialises in corporate and commercial disputes, director and shareholder disputes, asset tracing claims, insolvency disputes, funds disputes, trust and probate disputes, formal corporate restructurings, contentious mergers, mediations and arbitrations.