(Available in French)
Tribunals Ontario is committed to the fair, just and efficient resolution of proceedings before it. This Practice Direction provides guidance to participants about the use of artificial intelligence (AI) in tribunal proceedings.
The field of AI is evolving rapidly. Tribunals Ontario will continue to monitor its use and impact, and will adjust this Practice Direction as necessary.
Adjudication is a human responsibility. Tribunal members hear cases and make decisions based on the evidence and submissions provided by parties. They do not use AI to write decisions or analyze evidence. Tribunal members are fully accountable for their decision-making.
In some instances, AI may be a helpful tool for parties, but it is not perfect. If you rely on AI for research or to prepare documents for the Tribunal, you must do so carefully. Keep these key points in mind:
AI results can be wrong. If you use AI to find legal sources or analyze information, double-check the results carefully. Parties are responsible for the accuracy of any case law, statements of legal principles, or evidence they tender.
AI might give you incorrect or made-up legal sources. Always verify the information by going directly to trusted sources, such as court or tribunal websites, official publishers, or recognized legal databases like CanLII for case law.
You are responsible for the accuracy of your written and oral submissions, even if AI helped prepare them. Always cross-check the information against reliable databases to ensure it is accurate and trustworthy. This protects the integrity of our justice system.