A federal appeals court on Wednesday expressed exasperation that lawyers are still filing briefs with AI-generated fictitious case citations and other hallucinated content, saying the problem “shows no sign of abating.”
The panel made that observation as it fined Heather Hersh of FCRA Attorneys $2,500. The three-judge panel of the New Orleans-based 5th U.S. Circuit Court of Appeals found that Hersh had used artificial intelligence to draft a brief she filed in a case and then failed to verify the accuracy of the AI-generated material.
Hersh, who did not respond to a request for comment, had filed the brief as part of an appeal of a sanctions award a judge had imposed on the attorney who founded her firm, Shawn Jaffer, and his law firm, then known as Jaffer & Associates, in a lawsuit accusing a lender and a credit reporting agency of Fair Credit Reporting Act violations.
In that case, a federal judge in Texas had ordered Jaffer and his law firm to pay the defendants a total of $23,000 in attorneys’ fees, concluding that Jaffer had not made even a basic inquiry into his client’s claims before bringing the case to court.
The 5th Circuit subsequently reversed that sanctions order. But in doing so, it issued a show-cause order against Hersh after finding 21 instances of fabricated quotations or gross misrepresentations of law or fact in her brief.
The panel, led by U.S. Circuit Judge Jennifer Walker Elrod, called Hersh’s response “disappointing,” saying she claimed to have relied on publicly available versions of the cases she cited and blamed several popular legal databases for the inaccuracies.
Elrod called Hersh’s response “not credible” and “misleading in several aspects,” adding that Hersh admitted to using AI only when she was later specifically asked whether she had used it. Elrod said the court likely would have imposed less severe sanctions had she accepted responsibility and been more forthcoming.
Elrod wrote, “However, when confronted with a serious ethical misstep, Hersh misled, evaded, and violated her duties as an officer of this court.”
The judge added that cases like Hersh’s, involving AI-hallucinated citations, “have increasingly become an even greater problem in our courts,” despite nearly three years of news stories about similar incidents since the first high-profile case in 2023.
She cited a database maintained by French lawyer and data scientist Damien Charlotin, which, as of Wednesday, listed 239 U.S. cases of AI-generated hallucinations in filings made by lawyers.
Elrod noted that in 2024 the 5th Circuit had considered adopting what would have been a first-of-its-kind rule among the appeals courts regulating lawyers’ use of generative AI before it, but decided against it, concluding that existing rules governing lawyers were sufficient.
Elrod added, “If it were ever an excuse to plead ignorance of the risks of using generative AI to draft a brief without verifying its output, it is certainly no longer so.”