A California appeals court recently imposed sanctions of $10,000 against an attorney for improperly citing case law generated with artificial intelligence ("AI"). The briefs filed by the attorney contained more than 20 case citations that either did not contain the language the attorney quoted or did not support the positions the attorney took. A few of the cited cases did not even exist.
In Noland v. Land of the Free, L.P., 2025 WL 2629868 (Cal. App. 2025), the appeals court affirmed the lower court's judgment dismissing claims made by a real estate broker against a property owner in connection with unpaid commissions. The appeals court called the appeal itself "unremarkable" but for the attorney's improper use of AI. The attorney acknowledged that he relied on AI to support his legal citations without reading the cases, but claimed he was unaware of "AI hallucinations," a term referring to AI generating false or misleading information. The appeals court was unsympathetic, noting that AI hallucinations have become a frequent issue in court briefs filed in recent years and that "it is a fundamental duty of attorneys to read the legal authorities they cite in appellate briefs or any other court filings to determine that the authorities stand for the propositions for which they are cited."
AI is a powerful tool, but with great power comes great responsibility. The Noland case should serve as a warning not only to attorneys but to anyone using AI: verify that the information AI generates is actually true before relying on it.