Authenticating AI Evidence in Florida: Closing the Gaps

Authentication is the foundation of admissibility. Under § 90.901, evidence must be authenticated or identified before it can be admitted, a requirement satisfied by evidence sufficient to support a finding that the item is what its proponent claims.[i] In Florida, this threshold is relatively low. Courts allow authentication through a variety of means, including appearance, content, substance, internal patterns, or other distinctive characteristics, whether established through extrinsic proof or by self-authentication. The rise of artificial intelligence (“AI”), however, poses new challenges to this framework. AI-generated content can appear highly convincing while bearing no connection to reality. Without stronger safeguards, courts risk admitting unreliable AI-based evidence that slips in under the current low bar.
Florida courts have addressed digital evidence by adapting established rules to new technology while taking care not to weaken core protections such as authentication, privacy, and fairness. Florida’s Evidence Code § 90.901 requires that evidence be supported by proof sufficient to show that it is what its proponent claims, and courts have applied this requirement with notable flexibility. In Lamb v. State, the Fourth DCA upheld the admission of a Facebook Live video authenticated through forensic testimony, reflecting the judiciary’s willingness to admit digital content when it is properly connected to its source.[ii] Similarly, in Tracey v. State, the Florida Supreme Court required warrants for cell-site location data, adapting constitutional protections to the realities of digital surveillance.[iii] And in Wilsonart, LLC v. Lopez, although the Court declined to create a narrow exception for video evidence, it signaled its intent to adopt the federal Celotex summary judgment standard,[iv] an intent later formalized in the amendment of Rule 1.510, which reshaped Florida’s summary judgment practice by giving judges greater authority to grant judgment when one side has no real evidence.[v] Taken together, these decisions illustrate how Florida courts balance adaptability with restraint, extending established doctrines to address digital evidence without undermining the foundational protections of authentication, privacy, and procedural fairness.
Even with these safeguards, authentication remains vulnerable. Parties are not required to disclose whether evidence is AI-generated, and contextual markers such as metadata can be fabricated. Proprietary algorithms may escape Daubert review when their methods are withheld (e.g., facial recognition software). And unlike the federal courts, Florida lacks self-authentication rules for electronic records, forcing parties to rely on live testimony for even routine digital files.[vi]
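To illustrate how little effort such fabrication requires, the sketch below (a minimal illustration using only the Python standard library; the file name and date are hypothetical) rewrites a file’s timestamps, the kind of contextual marker a proponent might offer to tie a digital exhibit to a particular time.

```python
# Minimal illustration: rewriting a file's timestamp "metadata" with the
# Python standard library. The file name and date are hypothetical.
import os
import datetime
from pathlib import Path

exhibit = Path("exhibit_photo.jpg")           # hypothetical evidence file
exhibit.write_bytes(b"placeholder contents")  # stand-in content for the demonstration

# Choose an arbitrary past moment and stamp it onto the file.
fabricated = datetime.datetime(2020, 1, 15, 9, 30).timestamp()
os.utime(exhibit, (fabricated, fabricated))   # overwrite access and modification times

# The file now reports the fabricated date as if it were genuine.
print(datetime.datetime.fromtimestamp(exhibit.stat().st_mtime))
```

Because alterations like this leave few obvious traces, testimony connecting an exhibit to its actual source remains essential, which is precisely the gap the pending reforms should address.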
The judiciary has already flagged AI concerns. In 2023, the Office of the State Courts Administrator warned that deepfakes and synthetic media could strain authentication standards. Florida’s 2025 procedural reforms, including proportional discovery,[vii] revised summary-judgment timelines,[viii] and the Supreme Court’s ongoing review of Evidence Code amendments (SC2025-0659), create an opening to confront AI’s impact directly.[ix]
Florida can act before AI evidence overwhelms its courts. While Florida Statute § 90.901, Daubert, and contextual proof offer a baseline, stricter disclosure and reliability rules in the pending Evidence Code amendments are needed to keep synthetic evidence out of the record.
[i] Fla. Stat. § 90.901 (2025).
[ii] Lamb v. State, 246 So. 3d 400, 413 (Fla. 4th DCA 2018).
[iii] Tracey v. State, 152 So. 3d 504 (Fla. 2014).
[iv] Wilsonart, LLC v. Lopez, 308 So. 3d 961 (Fla. 2020); Celotex Corp. v. Catrett, 477 U.S. 317 (1986).
[v] See Fla. R. Civ. P. 1.510 (2025).
[vi] See Fed. R. Evid. 902(13)–(14); see also Daubert v. Merrell Dow Pharm., Inc., 509 U.S. 579 (1993) (requiring judges to act as evidentiary gatekeepers who admit expert testimony only when it rests on scientifically valid methodology, as demonstrated through factors such as testability, peer review, error rates, standards, and general acceptance).
[vii] Fla. R. Civ. P. 1.280 (2025).
[viii] Fla. R. Civ. P. 1.510 (2025).
[ix] In re Amends. to the Fla. Evidence Code § 90.404(2)(c), No. SC2025-0659 (proposed Apr. 2025).