Hallucinations in Binary: Problems with Artificial Intelligence in Court

It’s every lawyer’s nightmare. You’ve drafted your brief late into the midnight dreary, poring over quaint and curious volumes of forgotten case law, racing against the ticking clock to meet your deadline. You have the perfect case to support your arguments—so perfect that it seems too good to be true. Before you have a chance to dwell on what that means, you realize that you’re minutes from the deadline. With shaking hands and no more time to think, you submit the brief just as the clock strikes midnight, its chime sounding eerily into the night. As you sit in the darkness, reflecting on how fortunate you were to find that perfect case, the phone rings—a shrill sound that snaps you from your thoughts. You answer, confused as to who could be calling at this hour. A rasping voice on the other end of the line whispers a revelation that makes your blood run cold. Your “perfect case” never existed—’twas just a phantom and nothing more.

Abandon All Claims, Ye Who Enter Here

Okay, maybe things aren’t as dramatic as we make them sound. But as virtually everyone plugged into recent developments knows, generative artificial intelligence, or AI, is quickly advancing. As it advances, people are finding more and more uses for it. Yet while generative AI has many beneficial uses across all kinds of industries, it is not yet equipped to handle the complex and ambiguous questions posed by legal practice. In fact, AI has a troublesome tendency to “hallucinate” case law, making it particularly dangerous in the legal context. From citing existing cases incorrectly to fabricating nonexistent cases altogether, current iterations of generative AI are rife with errors born of these hallucinations.

The North Carolina Court of Appeals recently had a run-in with potentially AI-generated legal arguments in Santree NC LLC v. Hines, a civil dispute between a tenant and a landlord. The tenant appealed an unfavorable summary judgment ruling, but her fatal mistake was violating Rule 28(b)(6) of the North Carolina Rules of Appellate Procedure by failing to include, in the body of her argument, citations to the authorities on which she relied. The landlord pointed out this deficiency, and the tenant responded with a reply brief containing some legal citations. However, the cases she cited either did not contain the language she quoted or did not exist. The Court of Appeals observed that “[t]he fabrications and miscited cases in Defendant’s brief strongly suggest the use of artificial intelligence,” noting further that “[m]any courts have noted the increased use of artificial intelligence in briefs, which can lead to numerous errors when not properly reviewed.” Ultimately, the Court of Appeals dismissed the appeal, concluding that the lack of relevant citations meant the tenant had abandoned all of her legal arguments.

Case Law from the Black Lagoon

The scariest part of Santree is that it is not an isolated incident. The Eastern District of North Carolina recognized in Letts v. Avidien Technologies, Inc. that “in courts across the nation, an apparent increased use of artificial intelligence technologies has given rise to citations to non-existent cases or legal citations.” That observation came just before the FTC’s admonishment of DoNotPay, Inc., which deceptively advertised its AI tool as “the world’s first robot lawyer.”

And it’s not just pro se parties turning to AI tools in legal contexts: multiple stories have come to light of lawyers admitting to, and being sanctioned for, citing fake cases.

The Not-So-Uncanny Valley

These anecdotes all point to one thing: generative AI just isn’t ready to step into the courtroom yet. In fact, one study found that AI models specifically designed for legal research are, at best, correct only 65% of the time. Professionals should also take care not to commit ethical violations by overrelying on AI. The State Bar of California, for example, has advised that “[o]verreliance on AI tools is inconsistent with the active practice of law and application of trained judgment by the lawyer.”

Current problems don’t mean that generative AI will never have a place in the profession. AI can expand access to justice and the legal process, especially for those who cannot afford a lawyer, and that is an end to be encouraged. That said, the law is ultimately a human profession, and it’s clear that removing the human element can lead to worse outcomes. To this end, U.S. Supreme Court Chief Justice John Roberts has urged “caution and humility” with respect to the integration of AI into the law.

While lawyers can rest a bit easier knowing that “Skynet: Attorney at Law” isn’t an imminent reality, it’s still worth taking the extra time to make sure a brief isn’t haunted by phantom cases. In any event, from all of us here at the What’s Fair? Blog team:

Happy Halloween!

October 31, 2025
Suraj Vege