AI False Citations (“Hallucinations”) and Legal Risk: What Every Lawyer Needs to Know About Verification and Reliability

Ali Mohammed | JuristAI Legal Technology Group, Inc

Mark Underwood | JuristAI Legal Technology Group, Inc

Live Video-Broadcast: December 12, 2025

2-hour CLE

This program is only available to All-Access Pass Members.

Program Summary

This program will explore why current AI tools can sometimes prove unreliable for legal applications. Specifically, it will cover how AI can invent false case citations or statutory references, known as "hallucinations."

We will discuss why AI hallucinations occur and examine real-world examples from the AI Hallucination Cases Database, highlighting the risks they pose in legal practice. The program will also address when AI systems can be trusted and when human oversight remains essential.

Attendees will learn how JuristAI eliminates hallucinations from its outputs through rigorous validation methods, and will gain practical guidance on when and how to build a manual verification workflow that incorporates source-of-truth retrieval, citation checking, and human-in-the-loop review, especially for practitioners not using automated hallucination detection tools.
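For practitioners building that workflow by hand, the core loop is straightforward: extract every citation from the AI output, attempt to match each one against a trusted source, and route anything unmatched to a human reviewer. The Python sketch below illustrates the pattern using the Free Law Project’s open-source eyecite library for citation extraction; the verify_against_source_of_truth function is a hypothetical placeholder for whatever authoritative source a firm actually uses (Westlaw, Lexis, or an in-house index), not part of any real API.

```python
# A minimal sketch of a manual-verification pipeline:
# extract -> attempt verification -> queue unverified cites for human review.
# Assumes the eyecite library (pip install eyecite) for citation extraction.
# verify_against_source_of_truth is a hypothetical placeholder, not a real API.

from eyecite import get_citations


def verify_against_source_of_truth(cite_text: str) -> bool:
    """Hypothetical lookup against a trusted citator or case-law database.

    Replace with a real query (Westlaw, Lexis, CourtListener, or an
    in-house index). Returning False here fails closed, so every cite
    lands in front of a human until it is proven real.
    """
    return False


def build_review_queue(ai_output: str) -> list[dict]:
    """Tag every citation found in an AI draft with its verification status."""
    queue = []
    for citation in get_citations(ai_output):
        cite_text = citation.matched_text()
        queue.append({
            "citation": cite_text,
            "verified": verify_against_source_of_truth(cite_text),
        })
    return queue


if __name__ == "__main__":
    draft = "As held in Smith v. Jones, 123 F.3d 456 (9th Cir. 1997), ..."
    for item in build_review_queue(draft):
        status = "OK" if item["verified"] else "NEEDS HUMAN REVIEW"
        print(f"{item['citation']}: {status}")
```

The fail-closed default is the important design choice: a citation is treated as suspect until a trusted source or a reviewer confirms it, which mirrors the human-in-the-loop review the program recommends.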

Key topics to be discussed:

  • What is an AI hallucination?
  • Why do hallucinations occur?
  • When are users most at risk of hallucinations occurring?
  • What kind of AI produces hallucinations?
  • The categories of hallucinations (fake cites vs. pincites vs. fake quotes vs. irrelevance)
  • Consequences of submitting AI hallucinations in court (Rule 11)
  • The AI Hallucination Cases Database (https://www.damiencharlotin.com/hallucinations/)
  • When AI can be trusted and when it cannot
  • Is this problem unfixable?
  • Is it worth using AI if this is a risk?
  • Automated post-processing of AI output (see the sketch following this list)
  • Manual post-processing of AI outputs
  • How JuristAI pre-empts and removes hallucinations from its work outputs
  • How lawyers can build a manual verification workflow (source-of-truth retrieval, citation validation, and human-in-the-loop review) if they aren't using software hallucination detection
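For the automated post-processing topic, a companion sketch shows the same extraction step with each cite checked programmatically against a case-law lookup service before anything reaches a reviewer. The endpoint URL and response shape below are illustrative assumptions, not a documented API; services such as CourtListener do offer citation-lookup functionality, but confirm the current interface before relying on it.

```python
# Sketch of automated post-processing: batch-check extracted citations
# against a case-law lookup service. The URL and the {"found": ...}
# response shape are illustrative assumptions, not a real API contract.

import requests
from eyecite import get_citations

LOOKUP_URL = "https://example-caselaw-service.test/citation-lookup"  # hypothetical


def check_citations(ai_output: str) -> dict[str, bool]:
    """Map each citation string to True (found) or False (suspect)."""
    results = {}
    for citation in get_citations(ai_output):
        cite = citation.matched_text()
        resp = requests.post(LOOKUP_URL, data={"citation": cite}, timeout=10)
        results[cite] = resp.ok and resp.json().get("found", False)
    return results
```

Anything mapped to False in the result should flow into the manual review queue sketched earlier rather than being silently dropped.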

This course is co-sponsored with myLawCLE.

Date / Time: December 12, 2025

  • 1:00 pm – 3:10 pm Eastern
  • 12:00 pm – 2:10 pm Central
  • 11:00 am – 1:10 pm Mountain
  • 10:00 am – 12:10 pm Pacific

Closed-captioning available

Speakers

Ali Mohammed | JuristAI Legal Technology Group, Inc

Ali Mohammed is the CEO and a co-founder of JuristAI. He has always had a strong sense of justice and a passion for the law. Ali volunteered as an intern at a criminal justice firm during college. He went on to earn a master’s degree in IT Management and became a solutions architect.

 

Mark Underwood | JuristAI Legal Technology Group, Inc

Mark Underwood is the Co-Founder of JuristAI and the creator of Lawvocate, an AI legal agent he trained that runs as a private GPT on OpenAI’s GPT-5 Thinking model and is rolling out via the OpenAI API. A Fractional Chief Legal Officer and the Legal Community Leader at Startups.com, Mark advises founders and legal teams on AI governance, data, IP, and contract strategy. He has 37+ years of experience aligning product velocity with legal defensibility and is admitted to practice before the U.S. District Court for the Western District of Michigan.

Mark’s practical frameworks and commentary on law and emerging tech routinely reach thousands of practitioners and operators across the startup ecosystem.

Agenda

I. What is an AI hallucination? | 1:00pm – 1:10pm

II. Why do hallucinations occur? | 1:10pm – 1:20pm

III. When are users most at risk of hallucinations occurring? | 1:20pm – 1:30pm

IV. What kind of AI produces hallucinations? | 1:30pm – 1:40pm

V. The categories of hallucinations (fake cites vs. pincites vs. fake quotes vs. irrelevance) | 1:40pm – 1:50pm

VI. Consequences of submitting AI hallucinations in court (Rule 11) | 1:50pm – 1:55pm

VII. The AI Hallucination Cases Database (https://www.damiencharlotin.com/hallucinations/) | 1:55pm – 2:00pm

Break | 2:00pm – 2:10pm

VIII. When AI can be trusted and when it cannot | 2:10pm – 2:15pm

IX. Is this problem unfixable? | 2:15pm – 2:20pm

X. Is it worth using AI if this is a risk? | 2:20pm – 2:25pm

XI. Automated post-processing of AI output | 2:25pm – 2:30pm

XII. Manual post-processing of AI outputs | 2:30pm – 2:40pm

XIII. How JuristAI pre-empts and removes hallucinations from its work outputs | 2:40pm – 2:50pm

XIV. How lawyers can build a manual verification workflow (source-of-truth retrieval, citation validation, and human-in-the-loop review) if they aren’t using software hallucination detection | 2:50pm – 3:10pm
