AI: An Enhancement, Not a Replacement for Attorneys
A feature article in the Daily Business Review by Tripp Scott's Manooch T. Azizi and Paul May
Artificial intelligence (AI) undoubtedly has the potential to enhance attorneys’ work product. But AI cannot replace attorneys, at least not at this juncture.
Murphy’s Law—“if anything can go wrong, it will”—rears its ugly head at the least opportune times. AI is in its relative infancy. Too much could go wrong with this emerging technology to risk a potentially painful and expensive legal outcome. The following article explores some relevant issues:
AI Can Cause Issues in Drafted Business Agreements
AI may be able to draft a limited, simple, and straightforward legal document appropriately. The problem is that “simple” and “straightforward” are in the eye of the beholder. AI is good at giving direct answers to questions. However, the answer to most questions in the legal context is: “It depends.” AI appears to struggle in the grey areas and may hallucinate fictitious, misleading, or incorrect terms or information, often with confidence.
Certain modern, pre-trained large language models (LLMs) designed for AI-driven automated statutory analysis were found to be unable to correctly solve complex tasks involving the cross-application of different sections of data with substituted values (common tasks in the preparation of a variety of legal and transactional documents). There are simply too many subroutines; the LLM’s reasoning inevitably falters by using a wrong value or applying it in the wrong place, and the end result is almost always incorrect. See Rohan Padhye, AI-Driven Statutory Reasoning via Software Engineering Methods, MIT Computational Law Report (Sep. 10, 2024). “... The technology’s propensity to create false, unrelated, ‘hallucinated’ content may be its greatest weakness. Major brands have repeatedly fallen victim to hallucination or adversarial prompting, resulting in both lost brand value and lost company value. ... The downside risks in AI hallucination are profound and equally impact individuals, businesses and society.” See Shomit Ghose, Why Hallucinations Matter: Misinformation, Brand Safety and Cybersecurity in the Age of Generative AI, UC Berkeley College of Engineering, Sutardja Center for Entrepreneurship & Technology (May 2, 2024).
Because AI Relies on End Users, the Information Being Pulled In May Not Be Relevant
Using an AI application places an immense burden on the end user to independently know, understand, and input any and all relevant information necessary for AI to generate the intended result. The skill of an attorney is not to come up with the “right” answers but rather to listen discerningly and ask thoughtful questions in order to:
Determine the parties’ intent;
Ferret out the underlying facts, complexities, and conditions of a particular business transaction, industry circumstances, or family situation;
Anticipate the relevant “what if” scenarios and zero in on potential risks and liability;
Maximize rights and remedies under the applicable laws of the jurisdiction in question;
Navigate and mitigate potential issues or discord;
Engage in back-and-forth negotiations of terms, which requires the ability to respond, often creatively, to changing and novel circumstances while zealously acting in a client’s best interest; and
Deal with ethical issues—a concept current AI especially struggles to grasp, let alone replicate.
All of the foregoing requires an adaptive human intelligence, coupled with training and experience, that is notably distinguishable from that of the AI available today. For all the knowledge and information available from “scraping” the vast reaches of the Internet, AI lacks the nuance and judgment to simultaneously handle these issues even in “simple” cases.
The Use of Ambiguous or Incorrect Language Can Lead to Litigation
In a transactional agreement, for example, missteps by an AI application (e.g., the use of ambiguous, wrong, or insufficient terms, or the misidentification of parties), the omission of necessary and beneficial terms (e.g., industry- or transaction-specific rights, remedies, or limitations of liability), or the hallucination of nonsense could end up costing a business substantially more money down the road to correct or litigate issues or disputes that could have, and should have, been addressed or mitigated during the formation of the agreement.
Recently, a federal judge sanctioned two attorneys who submitted a legal brief that included six fictitious case citations generated by an artificial intelligence chatbot, noting among the resulting harms that “the client may be deprived of arguments based on authentic judicial precedents.” See Mata v. Avianca, 678 F. Supp. 3d 443, 448 (S.D.N.Y. 2023).
For now, anyone pursuing business or personal objectives through the law or the court system would be well-advised to engage an attorney to complement—not jeopardize—their objectives.