

After completing this lesson, you will be able to recognize case hallucination, verify AI-generated case citations, and confirm that a cited case is still good law.
AI can generate case citations, but it sometimes fabricates cases or cites outdated rulings. The fabrication error is known as:
Case Hallucination – the AI produces a realistic-sounding but nonexistent case.
In legal work, using a fake or wrong citation can expose the lawyer to court sanctions, damage professional credibility, and harm the client's case.
Therefore, every case citation must be verified manually.



Even if a case is real, it may have been reversed, overturned, or superseded by a later ruling.
To check, search the case title in Google Scholar (Case Law) or LawPhil, then review the later decisions that cite it.
If later cases say the ruling is “no longer controlling,” the case is no longer good law.
Example of an AI-generated citation that must be verified before use:
“The Supreme Court ruled in Garcia vs. ABC Company, G.R. No. 195847 (2019), that employee social media posts cannot justify termination.”

Please complete this quiz to check your understanding of Lesson 3.2.
You must score at least 70% to pass.
This quiz counts toward your certification progress.
Click here for Quiz 3.2
Verifying case law is a non-negotiable professional requirement.
AI can accelerate research, but it cannot replace the lawyer’s responsibility to confirm accuracy.
Proper verification protects the client, the lawyer’s professional credibility, and the integrity of the court.
AI suggests. The lawyer verifies. The court decides.