Using an AI Service for Business Contracting: Are We There Yet?
Since the public launch of ChatGPT late last year, I have been inundated with solicitations from companies offering new AI tools for business contracting. Are they ready for prime time?
These AI products generally claim they can do three things: summarize a contract, review a contract for certain kinds of terms (e.g., limitation on liability), and suggest amendments to a contract based on policies you load into the system (e.g., your company is the vendor, and you wish to limit its maximum liability to the total amount paid).
Technologically, AI for contracting has strengths and weaknesses that make it promising in some situations but unhelpful, or even counterproductive, in others. Based on conversations with people who have expertise in AI technology, here are some predictions about where these tools will and won't be useful.
Contract Summarization
Of the three things a company might call on an AI contracting product to do (contract summarization, review for critical terms, and language revision), the AI will be weakest in summarizing.
Technologically, to perform summarization, the AI reviews the contract, tries to ascertain which words are most important, and eliminates the unimportant parts to generate a summary. In doing so, it can make critical mistakes. It might fail to resolve indefinite pronouns or other indefinite references, such as a stand-alone "this" that points to something elsewhere in the contract. It might drop a key negation term (such as "not") because of where it sits in a sentence. And it might misreport a key quantitative value (e.g., a monetary amount or the number of days to do something) because the compression cuts out key details or qualifiers.
Thus, it’s risky to rely on an AI-generated summary to report the critical aspects of a contract accurately.
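To make the dropped-qualifier problem concrete, here is a minimal, purely hypothetical Python sketch. It is not how any real summarization product works; it simply trims a clause to a fixed number of words, which is enough to show how a shortened version can silently drop a carve-out and misstate a liability cap.

```python
# Toy illustration only -- not any vendor's actual summarization method.
# A crude "compressor" keeps only the first eight words of a clause.
clause = ("Vendor's total liability shall not exceed the fees paid, "
          "except for claims arising from gross negligence or willful misconduct.")

summary = " ".join(clause.split()[:8]) + " ..."
print(summary)
# Output: Vendor's total liability shall not exceed the fees ...
# The gross-negligence carve-out is gone, so the "summary" misreports
# the real scope of the liability cap.
```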
Contract Revision
The AI should do well at identifying important contractual terms and suggesting revisions that comply with the contracting policies you input into the system. Using various techniques, including computerized thesauruses, it should rarely fail to flag the types of terms you want found or to substitute language reflecting your preset policies.
But are you willing to trust the computer not to miss or mishandle crucial issues? An attorney can check the replacement language quickly by reviewing the redlined changes. But confirming that the AI flagged every important contractual term would require the attorney to read the whole document, which cuts into the savings. Thus, your savings will depend on your risk tolerance.
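For readers who want a feel for what "flag and substitute" looks like mechanically, here is a rough, hypothetical Python sketch. The policy name, search pattern, and replacement language are invented for illustration and are not any product's actual rules; the point is that a clause phrased in unexpected words can slip past the matcher, which is why confirming complete coverage still requires a human read of the whole document.

```python
# Illustrative sketch only -- not any real contracting product's implementation.
import re
from dataclasses import dataclass

@dataclass
class Policy:
    name: str            # the contracting policy, in plain English
    pattern: str         # how the tool tries to spot the relevant clause
    preferred_text: str  # the company's preset fallback language

policies = [
    Policy(
        name="Cap vendor liability at fees paid",
        pattern=r"liab\w+",
        preferred_text=("Vendor's aggregate liability shall not exceed "
                        "the total fees paid under this Agreement."),
    ),
]

def review(contract_text: str, policies: list[Policy]) -> list[tuple[str, str]]:
    """Return (policy name, suggested replacement) for each policy that matches."""
    hits = []
    for p in policies:
        if re.search(p.pattern, contract_text, flags=re.IGNORECASE):
            hits.append((p.name, p.preferred_text))
    return hits

# A risky clause that never uses the word "liability" slips past the matcher:
print(review("Supplier shall bear unlimited responsibility for all losses.", policies))
# Output: []  -- the clause was not flagged; only a full read would catch it.
```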
Best Situations for AI Usage
AI contracting will work best when the contracts being processed are of a commonplace type. The contract type must be commonplace for the AI's training data to give it adequate guidance.
If the contracting situation is rare, such as a contract containing custom terms that seldom appear elsewhere, the AI may struggle. It might do poorly at flagging important terms because it doesn't know what matters in uncommon situations, and it might do a poor job of suggesting revised language.
Beyond that, the AI might be helpful to a contracts lawyer as a brainstorming buddy. For example, if the lawyer asks an AI to draft a contract of a particular type, the AI might suggest terms the lawyer wouldn’t think of or remember to include.
AI Won’t Help with Custom Business Terms
An AI likely will provide no value, or perhaps negative value, when it comes to custom business terms. A contract is not just boilerplate, such as termination rights, limitation of liability, and confidentiality. A contract usually contains business terms, such as prices, timelines, and descriptions of deliverables and services.
Unfortunately, those business terms are often vague or incomplete because they are written by business managers rather than lawyers. The AI isn’t going to be materially helpful in identifying or fixing those inadequacies. When contractual relationships fail, it’s often because of a lack of clarity and detail in business terms.
Consider Bias
If the training data used by the AI consists of a corpus of contracts slanted toward one side of a deal, the AI might do poorly in suggesting revisions appropriate to the interests of the other side of the deal.
For example, if you represent a borrower in negotiating a commercial loan agreement, and the AI was trained on commercial loan agreements drafted by lenders, it might do a poor job of flagging important terms and suggesting revisions that protect the borrower. It should still handle well the specific contracting policies you input into the AI, such as striking any confession-of-judgment provision. But, due to training bias, the AI might miss issues you didn't put in your contracting policies.
A longer version of this column, with more information, is available on John Farmer’s Substack. You can view and subscribe to that Substack here:
https://johnfarmer.substack.com/
Written on April 19, 2023
by John B. Farmer
© 2023 Leading-Edge Law Group, PLC. All rights reserved.