Experienced arbitrators and counsel know well that the FAA provides that an award may be vacated where it was procured by “undue means,” where the arbitrator was guilty of “misbehavior,” or where the arbitrator “imperfectly executed” his or her powers. 9 U.S.C. § 10(a)(1), (3), (4). If you have thought to yourself that “nothing like that will ever happen to me,” generative AI should make you think again, as illustrated by three 2025 decisions in which non-existent cases were cited.
In Kohls v. Ellison, No. 24-cv-3754, 2025 WL 66514 (D. Minn. 2025), defendant retained as an expert witness the co-director of Stanford University’s Cyber Policy Center for the purpose of submitting a statement in support of a state ban on the dissemination of election-related AI-generated content. Opposing counsel objected that the expert’s statement included several AI-generated fake citations. The federal district court not only sustained the objection but declared that the expert’s statement had been made “under penalty of perjury with fake citations.” Further, the court refused to allow defendant to submit a corrected version of the statement. The judge “reminded” counsel that the Federal Rules of Civil Procedure require that the truth of all filings be verified, and again she refused to accept generative AI false statements “submitted under penalty of perjury.”
In Noland v. Land of the Free L.P., 2025 WL 26229868 (Cal. App. Sept. 12, 2025), the appellate court did not object to counsel filing a brief prepared using generative AI but issued a $10,000 sanction because the brief cited irrelevant and even non-existent cases. The court stated plainly that “no brief, pleading, motion, or any other paper filed in any court should contain any citations—whether provided by generative AI or any other source—that the attorney responsible for submitting the pleading has not personally read and verified.”
And in Coomer v. Lindell, No. 1:22-cv-01129 (D. Colo. 2025), the Chief Judge, reviewing a defense brief, found nearly 30 defective citations that, “most egregiously,” included citations to non-existent cases, and she fined each member of the defense legal team $3,000.
Misuse of generative AI has itself generated significant litigation sanctions, including suspension from practice, fines, mandated CLEs, and reimbursement of the opposing party’s fees. Although arbitrators may not be perceived as practicing law, they would be well advised to consult ABA Standing Committee on Ethics and Professional Responsibility Formal Opinion 512. It refers to ABA Model Rule 1.1, which requires lawyers to be competent, as well as Rule 3.3, which requires candor toward the tribunal.
Moreover, Rule 1.4 provides that a lawyer must consult with a client regarding the means by which the client’s objectives are to be accomplished. While parties to an arbitration are not clients of the arbitrator in the traditional sense, a court may find that the principle of Rule 1.4 pertains to the arbitration process, as the parties are entitled to know whether an arbitration award is the product of the arbitrator, of generative AI, or of both. Are an arbitrator’s powers “imperfectly executed” when the arbitrator fails to disclose the use of generative AI to the parties? At this stage of the generative AI phenomenon, it is critically important that commercial arbitrators issuing reasoned awards keep a verifying human in the loop.
