On March 11, 2026, the Fourth District Court of Appeal of Florida issued its per curiam opinion in Roussell v. Bank of New York Mellon, affirming the trial court’s judgment on the merits while devoting much of the opinion to a troubling issue: the appellant’s reliance on fabricated legal authority.[i] The court found that thirteen of the cases cited in the appellant’s brief did not exist and that nine additional citations, while referencing real cases, did not stand for the propositions the appellant attributed to them. The decision adds to a growing line of Florida appellate rulings confronting the consequences of artificial intelligence-generated legal research.
The underlying dispute arose from a foreclosure action brought by the Bank of New York Mellon against Samantha Roussell in Broward County’s Seventeenth Judicial Circuit. Roussell, proceeding pro se, appealed the circuit court’s ruling. In her appellate brief, however, she relied extensively on cases that the court determined were “hallucinated,” a term now widely used to describe fabricated outputs produced by generative AI tools. The Fourth DCA expressly disregarded these nonexistent authorities and affirmed the lower court’s decision without further discussion of the merits.
Roussell is not the first occasion on which the Fourth DCA has addressed AI-generated citations. In Friend v. Serpa, the same court warned a pro se appellee who cited nonexistent cases that such “phantom authority” must be disregarded and that sanctions were available under the Florida Rules of Appellate Procedure.[ii] Earlier, in Goya v. Hayashida, the Fourth DCA found that a pro se party’s answer brief was “replete with and entirely supported by fake cases and legal propositions, presumably generated by artificial intelligence.”[iii] Taken together, these decisions reflect the Fourth DCA’s increasing frustration with litigants who submit AI-generated authority without verifying its accuracy.
The problem of AI hallucinations in legal filings extends well beyond Florida. The issue first gained national attention in 2023 when a federal judge in the Southern District of New York sanctioned two attorneys in Mata v. Avianca, Inc. for submitting a brief that cited entirely fictitious cases generated by ChatGPT.[iv] Since then, courts across the country have encountered similar filings from both attorneys and pro se litigants who failed to verify AI-generated outputs. In response, the Florida Bar issued Ethics Opinion 24-1 in January 2024, confirming that lawyers may use generative AI but emphasizing that they remain fully responsible for the accuracy and competence of all submissions to the court.[v]
What makes Roussell particularly notable is the sheer scope of the inaccurate citations. The court identified thirteen wholly fabricated cases and nine real cases cited for propositions they do not support, totaling twenty-two deficient citations in a single brief. The court emphasized that any party, whether represented by counsel or proceeding pro se, bears responsibility for the content of submissions to the court. While the panel declined to impose sanctions, it expressly noted its authority to do so under Florida Rule of Appellate Procedure 9.410(a) for noncompliance with Rule 9.210(c), which governs the content and form of appellate briefs.[vi]
For Florida practitioners, Roussell signals that the Fourth DCA’s patience with AI-fabricated authority is wearing thin. Each successive opinion, from Goya to Friend to Roussell, has included progressively sterner warnings. The court’s repeated reminders that sanctions remain available suggest that future litigants who submit hallucinated citations may not receive the same leniency. As generative AI tools become more accessible, both attorneys and self-represented parties must treat their outputs as a starting point for research, not a substitute for it. The obligation to verify every citation remains squarely on the individual who signs the brief.
[i] Roussell v. Bank of N.Y. Mellon, No. 4D2025-1309, 2026 WL 681054, at *1 (Fla. 4th DCA Mar. 11, 2026).
[ii] Friend v. Serpa, 425 So. 3d 51, 51 (Fla. 4th DCA 2025).
[iii] Goya v. Hayashida, 418 So. 3d 652, 656 (Fla. 4th DCA 2025).
[iv] Mata v. Avianca, Inc., 678 F. Supp. 3d 443, 448 (S.D.N.Y. 2023).
[v] Fla. Bar Ethics Op. 24-1 (Jan. 19, 2024).
[vi] Fla. R. App. P. 9.410(a); see also Fla. R. App. P. 9.210(c).