Benefits and Risks of Using AI
Artificial Intelligence (AI) has already impacted litigation through advancements in document review and analysis, and its use continues to grow at a rapid pace. For example, AI is being used in the legal industry for contract review and analysis, for predictive analytics to forecast litigation outcomes, and to quickly sift through vast amounts of data to identify relevant information, reducing the time and costs involved in these tasks.
The arrival of ChatGPT, a generative AI tool that can produce clear, coherent, human-like text in response to a question, is anticipated to provide further benefits and efficiencies to attorneys by assisting with legal research, analyzing and summarizing key information from lengthy documents, and producing first drafts of various legal documents. All of these are clear benefits for increasing efficiency and lowering costs.
However, there are many known and unknown risks associated with using generative AI in the legal industry, including unverified output, confidentiality and security concerns, bias, and copyright issues. In addition, attorneys who use generative AI need to determine whether they are required to provide notice to the court.
Sanctions for Relying on Hallucinations
Recently, attorneys have used ChatGPT to conduct legal research. However, generative AI tools like ChatGPT may provide incorrect answers, make up laws, or cite cases that do not exist, which are termed “hallucinations” in the AI world. Unfortunately, this has led some attorneys to cite nonexistent case law generated by AI in documents filed with the court, which in turn has led to sanctions:
- Mata v. Avianca, Inc., 2023 WL 4114965, at *3, *9 (S.D.N.Y. June 22, 2023) (imposing sanctions on attorney who used ChatGPT for legal research even though attorney was not aware that ChatGPT could make up cases, and failed to check whether the citations were real or accurate)
- Park v. Kim, 2024 WL 332478 (2d Cir. Jan. 30, 2024) (referring attorney to grievance panel for relying on ChatGPT without checking its results and for citing non-existent decision in reply brief)
- People v. Crabill, 2023 WL 8111898, at *1 (Colo. O.P.D.J. Nov. 22, 2023) (suspending attorney for violating various ethical rules by failing to check cases provided by ChatGPT)
AI Orders Requiring Certification on the Use of AI for Court Filings
In response to the growing use of generative AI in court filings, several judges from the federal courts have enacted orders requiring counsel to affirmatively disclose or file certifications regarding their use of generative AI:
- Judge Brantley Starr of the U.S. District Court for the Northern District of Texas was the first to issue a standing order requiring counsel to file a certificate “attesting either that no portion of any filing will be drafted by generative artificial intelligence (such as ChatGPT, Harvey.AI or Google Bard) or that any language drafted by generative artificial intelligence will be checked for accuracy….”
- Magistrate Judge Gabriel A. Fuentes of the U.S. District Court for the Northern District of Illinois issued a standing order to address “the fast-growing and fast-changing area of generative artificial intelligence (AI) and its use in the practice of law” and requiring “[a]ny party using any generative AI tool in the preparation or drafting of documents for filing with the Court must disclose in the filing that AI was used and the specific AI tool that was used to conduct legal research and/or to draft the document.”
- Judge Stephen Alexander Vaden of the U.S. Court of International Trade issued an Order on Artificial Intelligence requiring disclosure of any generative AI program used and of all portions of text drafted with its assistance, as well as certification that the use of the generative AI tool did not disclose confidential information to unauthorized parties. Judge Vaden expressed his concern that generative AI tools “that supply natural language answers to user prompts, such as ChatGPT or Google Bard, create novel risks to the security of confidential information.”
- Judge Michael Baylson of the U.S. District Court for the Eastern District of Pennsylvania has issued a broader order requiring disclosure of any type of AI, rather than limiting the disclosure requirement to generative AI. This broader requirement could thus encompass disclosure of AI tools that attorneys have long used, such as technology-assisted review or research platforms like Westlaw or Lexis.
- Magistrate Judge Peter Kang of the U.S. District Court for the Northern District of California issued a standing order that distinguishes between generative AI tools and tools that use other categories of AI, stating that the disclosure requirement does not apply to the use of traditional AI such as “the use of traditional legal research, word processing, spellchecking, grammar checking or formatting software tools (e.g., Lexis, Westlaw, Microsoft Word or Adobe Acrobat).” Judge Kang’s order addresses disclosure requirements for AI and filings with the court, AI and confidentiality, and AI and evidence (recognizing that “AI-generated documents or materials (for example, created by a party prior to the commencement of litigation) are or may become exhibits, evidence or the subject of factual disputes in an action”).
Key Takeaways
- As the landscape for generative AI continues to evolve, it is important that attorneys and pro se litigants who use generative AI check the accuracy of the answers provided by the generative AI tool before relying on them in a document filed with the court.
- Counsel and pro se litigants who use generative AI must pay attention to the “novel risks” highlighted by Judge Vaden, such as the potential for confidential information to be submitted through prompts to an AI program.
- Attorneys and pro se litigants need to carefully review the applicable standing orders and local rules concerning the use of and reliance on generative AI in court filings, including any disclosure requirements imposed by the courts.