In a judgment delivered on 26 March 2026 in James Guerin v Gemma O’Doherty [2026] IECA 48, Costello P., for the Court of Appeal, addressed for the first time the appropriate use of artificial intelligence (“AI”) in the preparation of legal submissions, and took the opportunity to set out guidelines governing the use of AI tools for that purpose.
The appeal concerned the Defendant’s challenge to a High Court decision refusing her application to have the defamation proceedings against her struck out. In the course of the appeal, Costello P. noted that the Defendant, Ms. O’Doherty, a lay litigant, had used AI to compile her submissions.
The submissions before the Court were found to include references to authorities which, upon scrutiny, “simply did not exist”, a phenomenon commonly referred to as “hallucination” and an inherent, well-documented risk of using AI. Further, it was evident that Ms. O’Doherty had not verified the existence of the authorities she cited, nor did the “cases” relied upon support the propositions she advanced.
The Court further noted that Ms. O’Doherty did not inform the Plaintiff’s solicitors or the Court that she had prepared her submissions with the aid of AI. Costello P. observed that “Parties, whether represented or not, have an obligation not to mislead the court, which includes the obligation not to rely upon, or advance submissions based upon ‘fake’ authorities or propositions which have no basis in law.”
In light of her concerns, Costello P. expressed the view that parties, including lay litigants, should use AI appropriately and should be given guidance as to how such programmes may be used to assist in litigation. She went on to set out principles of general application.
Costello P. concluded the judgment with a warning that irresponsible use of AI risks bringing the administration of justice into disrepute and actively misleading the court.
This judgment represents a timely and significant intervention by the Court of Appeal at a moment when AI tools are becoming increasingly accessible to legal practitioners and lay litigants alike. In setting out clear expectations around the responsibilities which arise where individuals use AI, the Court has emphasised that while such tools may serve as a valuable aid, they cannot displace the obligation on litigants, whether represented or not, to ensure that their submissions to the Court are accurate. It is anticipated that this decision will act as a benchmark for future judicial consideration of the role of AI in litigation, as the legal system continues to adapt to transformative technologies.
The judgment can be read in full here.
For further information please contact Gavin Simons (Partner) or your usual AMOSS contact.