By Maxime Gagné — Justice-Quebec.ca · April 22, 2026
On April 22, 2026, the Quebec Superior Court rendered a landmark decision on artificial intelligence and justice.
In ARIHQ v. Santé Québec, 2026 QCCS 1360, Justice Martin F. Sheehan set aside an arbitral award whose entire body of doctrinal and case-law references turned out to be hallucinations produced by a generative AI tool.
This decision establishes the first Quebec benchmarks governing the use of artificial intelligence by arbitrators and other adjudicative decision-makers, while reaffirming the fundamental principle that an arbitrator cannot delegate decision-making authority — not even to an algorithm.
Association des ressources intermédiaires d’hébergement du Québec (ARIHQ) v. Santé Québec — CIUSSS du Centre-Sud-de-l’Île-de-Montréal, 2026 QCCS 1360
Quebec Superior Court · Commercial Division
Decision rendered: April 22, 2026 · Hearing: March 17 and 18, 2026
Presiding: The Honourable Martin F. Sheehan, J.S.C.
The Facts of the Case
The story begins with a fairly ordinary dispute between a healthcare provider and Quebec’s public health network. Centre de Santé Osman, an intermediate residential care resource, was claiming approximately $1,225,000 from the CIUSSS du Centre-Sud-de-l’Île-de-Montréal (now Santé Québec) for residential placements declared available between 2019 and 2022, for which no users had been referred.
The CIUSSS refused to pay. The institution’s position, communicated in November 2021, was "clear and final". However, the Association des ressources intermédiaires d’hébergement du Québec (ARIHQ) did not transmit its notice of disagreement until nearly three years later, on August 1, 2024 — whereas the National Agreement binding the parties provides for a strict 90-day deadline.
CIUSSS’s position: The notice of disagreement was filed well past the deadline. The process was long dead.
ARIHQ’s and Osman’s position: The 90-day contractual deadline is contrary to public order, since article 2884 of the Civil Code of Québec prohibits modifying limitation periods by contract — and the general limitation period is three years.
On August 8, 2025, arbitrator Me Michel A. Jeanniot rendered an award dismissing the claim as out of time. The award concluded that the deadline clause was valid and binding on the parties. On the result itself, the decision was not particularly surprising.
It was upon reading the reasons that the Applicants made a troubling discovery.
The Discovery: References That Don’t Exist
The Applicants’ lawyers did what any rigorous practitioner would do after reading a decision: they tried to consult the authorities cited by the arbitrator. That is where the surprises began.
In paragraph 71 of his award, the arbitrator cited a certain Frédéric Bachand for an article titled "Prescription et déchéance : frontières mouvantes et enjeux pratiques", supposedly published in Développements récents en droit des contrats by the Bar of Québec in 2016. This article does not exist.
Further on, in paragraphs 84 and following of his award, the arbitrator relied on three decisions to support his main reasoning:
1. Ville de Montréal v. Syndicat des cols bleus regroupés de Montréal (SCFP, section locale 301), 2005 QCCA 591 — this decision does not exist. The neutral citation leads to an entirely different judgment.
2. Groleau et Groupe Pages Jaunes Cie, 2011 QCCS 5386 — also non-existent. The neutral citation likewise leads elsewhere.
3. Tremblay v. Commission scolaire de la Jonquière, 2002 CanLII 24357 (QCCA) — equally non-existent. Same pattern: a neutral citation pointing to another case altogether.
And it continues. In paragraph 105 of his award, the arbitrator invoked an "arbitral decision" titled Arbitrage CHU Ste-Justine (D.T.E. 2018-30) to support the validity of a 30-day contractual deadline. SOQUIJ confirmed that this decision does not exist. The citation closely resembles that of a real decision (Sonin v. Université Concordia), but that case has absolutely nothing to do with the proposition for which it was cited.
Justice Sheehan made the essential observation: these phantom references lay at the very heart of the arbitrator’s reasoning. They constitute the only doctrinal and jurisprudential authorities relied upon as legal support for the award. The other citations appear only in sections summarizing the parties’ positions.
"The preponderant evidence therefore leads to the conclusion that the Arbitrator’s authority was delegated and that he abdicated his role of reviewing the result."
— Justice Martin F. Sheehan, J.S.C., paragraph 114

How to Set Aside an Arbitral Award in Quebec
To fully appreciate the scope of this decision, one must understand that an arbitral award is, in principle, final and without appeal. This is one of arbitration’s defining features: by consenting to it, parties agree to forgo the usual judicial recourses.
The only way to challenge an award is through the application for annulment under article 648 of the Code of Civil Procedure. And the grounds for annulment are exhaustive: the court cannot review the merits of the dispute, nor pronounce on the soundness of the reasoning, nor even on the reasonableness of the result.
An arbitral award may be set aside only on one of the following grounds: incapacity of a party or invalidity of the arbitration agreement; failure to follow the procedure for appointing the arbitrator or the applicable arbitration procedure; inability of a party to present its case; an award dealing with a dispute not contemplated by the agreement; or an award contrary to public order.
The Applicants invoked two grounds: that the award was contrary to public order (because it upholds a contractual deadline shorter than the legal limitation period), and that the arbitration procedure had not been respected (because the arbitrator had allegedly delegated his function to artificial intelligence).
Justice Sheehan dismissed the first argument. He reaffirmed a well-established principle: even an error of law, even on a public-order provision, is not a ground for annulment. The court can intervene only if the result of the award (and not its reasoning) offends fundamental principles of Quebec public order. The Court of Appeal has already confirmed that contractual notice deadlines like the one in the National Agreement are valid.
It is on the second argument — the failure to respect the arbitration procedure — that the decision turns.
The Arbitrator Chosen by the Parties Must Draft the Award Personally
The core of Justice Sheehan’s reasoning rests on an ancient principle of administrative and arbitration law: delegatus non potest delegare — one who receives a power cannot delegate it.
In arbitration, this principle takes on particular resonance. The parties choose their arbitrator. They select that person based on expertise, experience, availability, and sometimes reputation. In the case at bar, the choice had to be made from a limited list of ten arbitrators predesignated in the National Agreement. This choice is not incidental: it is the cornerstone of contractual arbitration.
Justice Sheehan articulates three pillars that lead to the same conclusion:
The autonomy of the parties’ will. The parties chose a specific arbitrator. They are entitled to expect that this arbitrator — and not a third party, much less an algorithm — will render the decision.
The importance of written reasons. As the Supreme Court emphasized in Baker, "reasons foster better decision making by ensuring that issues and reasoning are well articulated and, therefore, more carefully thought out". The drafting process is itself an instrument of reflection.
The secrecy of deliberation. Article 644 of the C.C.P. imposes on the arbitrator a duty to preserve the confidentiality of his deliberations. Communicating the details of a case to an external AI tool constitutes, in itself, a potential breach of this duty.
The judge nevertheless takes care to specify that this rule does not prohibit all use of tools. The arbitrator may consult researchers, clerks, translation or citation aids. He may even consult colleagues. But the responsibility for drafting must remain with the decision-maker, and third-party participation must not undermine the integrity of the process.
Artificial Intelligence in the Justice System: A Revolution Under Scrutiny
Justice Sheehan does not demonize artificial intelligence. He even cites with approval Justice Morin, who recently observed in Specter Aviation Limited v. Laprade that "any technological measure capable of fostering citizens’ access to the justice system should be welcomed and regulated rather than proscribed and stigmatized".
Indeed, a growing number of legal professionals use Large Language Models (LLMs) to summarize documents, identify relevant materials within gigabytes of data, transcribe audio files, or refine the drafting of existing texts. These uses are legitimate.
But the use of AI in a judicial context presents specific risks that Justice Sheehan details with precision:
1. Hallucinations. LLMs regularly generate false legal references that appear perfectly authentic and can only be detected through careful verification.
2. The absence of human discretion. Algorithms cannot adequately integrate community values, the subjective characteristics of the parties, or particular contextual circumstances.
3. Bias. AI reproduces the biases present in its training data, and the companies that develop these tools often refuse to disclose their source code (the "black box" phenomenon).
4. Lack of confidentiality. Information transmitted to an AI tool may be retained by the provider and later disclosed, a major problem for the secrecy of deliberation.
5. Public confidence. If litigants believe a machine has decided their case in place of a human being, confidence in the system may be shaken.
Justice Sheehan also cites the Canadian Judicial Council Guidelines, published in September 2024, which are unequivocal: no judge is authorized to delegate decision-making authority, whether to a judicial assistant, an administrative assistant, or a computer program, regardless of its capabilities.
"Judges are exclusively responsible for the judicial decisions they render."
— Canadian Judicial Council, Guidelines, September 2024

This rule, which applies to judges, logically extends to arbitrators. Justice Sheehan confirms it: the same principles apply to any decision-maker entrusted with an adjudicative function.
A Rule of Proportionality, Not an Absolute Prohibition
The judgment carefully avoids laying down an absolute rule. Justice Sheehan specifies that not every award containing erroneous references or involving the use of AI should necessarily be set aside.
The analysis must be contextual. The court must weigh:
(1) the nature of the breach in light of the arbitration procedure engaged; (2) the seriousness of the impact on the integrity of the procedure; (3) the actual impact of the breach on the award itself.
A minimal use of AI on a peripheral issue would likely not result in annulment. Similarly, when the standard of review is different — for example on appeal or judicial review — a reasonable decision could survive a procedural breach.
But in the case at bar, the breach was massive. The entire body of doctrinal and case-law references serving as the foundation of the award was fabricated. The breach struck at the very integrity of the reasoning. It was likely to affect the parties’ confidence not only in the award itself, but in the arbitration regime as a whole.
As the judge notes, "a party may reasonably believe that a more thorough verification of the decisions [by the arbitrator] would have prompted him to reconsider his position". In other words: the breach probably had a determinative impact on the result.
What This Decision Changes for Legal Practice
The judgment ARIHQ v. Santé Québec belongs to a recent wave of Canadian and foreign decisions sanctioning the inappropriate use of artificial intelligence in the justice system. But the great majority of these cases involved lawyers or self-represented litigants who had filed pleadings containing hallucinations.
The originality of the Sheehan decision lies in the fact that it targets a decision-maker. And it lays down a principle that extends well beyond the case at bar.
For arbitrators, administrative judges, and other adjudicators, the message is clear: the tool may assist, but it can never replace the deliberative function. An arbitrator who asks ChatGPT (or any other LLM) to draft his reasons, and who simply copies them without verification, abdicates his function.
For parties to an arbitration, the decision opens a new avenue of analysis. Before treating an award as "final and without appeal", it becomes prudent to verify the cited authorities systematically. A single fabricated citation will not necessarily lead to annulment, but an award whose reasoning rests on phantom references may well be set aside.
For lawyers and counsel, the lesson is more nuanced. AI remains a valuable tool. The Superior Court expressly recognizes this. But the obligation of human verification — what the Quebec Superior Court’s October 2023 notice called "rigorous human oversight" — remains non-negotiable.
A Decision That Draws a Clear Line
The judgment in ARIHQ v. Santé Québec does not revolutionize Quebec law. It applies long-standing principles — respect for the parties’ will, secrecy of deliberation, prohibition on delegating decision-making authority — to a new phenomenon: the possibility, for a hurried or careless decision-maker, of entrusting his intellectual task to a machine.
Justice Sheehan does not fall into the all-or-nothing trap. He rejects both the demonization and the trivialization of AI. Instead, he proposes an adapted analytical framework: what matters is not the tool, but the human responsibility behind the decision.
At a time when generative tools are becoming ubiquitous in legal practice, this judgment sets an essential benchmark. It reminds us that the legitimacy of the justice system rests, in the final analysis, on a simple fact: behind every decision, there must be a person who has actually thought, grasped the issues, and assumed the responsibility of deciding.
Editorial note: This decision is subject to appeal. Justice-Quebec.ca will follow the case’s evolution and publish an update should it be brought before the Court of Appeal.
For legal professionals who use generative AI tools in their practice, the Quebec Superior Court published a notice to the legal community on October 24, 2023, regarding the integrity of submissions to the courts. The Canadian Judicial Council, for its part, published its Guidelines for the Use of AI in Canadian Courts in September 2024.
The information presented here is for informational purposes only. Justice-Quebec.ca does not provide legal advice. For any personal question, please consult a member of the Bar of Québec.
Related articles
- Official decision: ARIHQ v. Santé Québec, 2026 QCCS 1360 — SOQUIJ
- Guidelines for the Use of AI in Canadian Courts — CJC
- Official notice: Quebec Superior Court Notice on AI — October 24, 2023
- Practical Legal Guides — Justice-Quebec.ca
Sources
Decision under commentary
- Association des ressources intermédiaires d’hébergement du Québec (ARIHQ) v. Santé Québec — Centre intégré universitaire de santé et de services sociaux du Centre-Sud-de-l’Île-de-Montréal, 2026 QCCS 1360 (Sheehan J.), SOQUIJ.
Cases cited
- Baker v. Canada (Minister of Citizenship and Immigration), [1999] 2 S.C.R. 817, SCC.
- Northwestern Utilities Ltd. v. Edmonton (City), [1979] 1 S.C.R. 684, SCC.
- Therrien (Re), 2001 SCC 35, SCC.
- Desputeaux v. Éditions Chouette (1987) inc., 2003 SCC 17, SCC.
- Dell Computer Corp. v. Union des consommateurs, 2007 SCC 34, SCC.
- I.W.A. v. Consolidated-Bathurst Packaging Ltd., [1990] 1 S.C.R. 282, SCC.
- Construction Infrabec inc. v. Paul Savard, Entrepreneur électricien inc., 2012 QCCA 2304, CanLII.
- Specter Aviation Limited v. Laprade, 2025 QCCS 3521 (Morin J.), CanLII.
- Ko v. Li, 2025 ONSC 2965, CanLII.
- Zhang v. Chen, 2024 BCSC 285, CanLII.
- Mata v. Avianca, Inc., 678 F. Supp. 3d 443 (S.D.N.Y. 2023).
Doctrine and guidelines
- Canadian Judicial Council, Guidelines for the Use of Artificial Intelligence in Canadian Courts, 1st ed., September 2024, Official PDF.
- Quebec Superior Court, Notice to the Legal Community and the Public — Integrity of Submissions to the Courts in Cases Involving the Use of Large Language Models, October 24, 2023, Official PDF.
- Action Committee on Court Operations in Response to COVID-19, Demystifying Artificial Intelligence in Judicial Processes, FJA.
- Patrick FERLAND, « Homologation et annulation des sentences arbitrales » in LegisPratique — Guide de l’arbitrage, 2nd ed., Montréal, LexisNexis, 2025.
- Judith GUÉRIN and Émilie CHEVRIE, « Jurisprudence récente en matière d’intelligence artificielle générative », Bulletin Praeventio, Quebec Bar Professional Liability Insurance Fund, October 1, 2025.
- Richard RE, « Artificial Authorship and Judicial Opinions », (2024) 92 George Washington Law Review 1558.
- Gary E. MARCHANT, « AI in Robes: Courts, Judges, and Artificial Intelligence », (2024) 50:3 Ohio Northern University Law Review, art. 2.
Statutes
- Code of Civil Procedure, CQLR, c. C-25.01, arts. 1, 624, 644, 645, 646, 648, LegisQuébec.
- Civil Code of Québec, CQLR, c. CCQ-1991, arts. 2639, 2884, 2925, LegisQuébec.
- Act respecting the governance of the health and social services system, CQLR, c. G-1.021, arts. 536, 538, 541, 542, LegisQuébec.