Picture: 123RF/SEMISATCH

As far back as 2018, the Harvard Business Review raised the question of whether artificial intelligence (AI) was changing contracts. Five years may not fit the criteria of “long term”, but with the pace at which technology is evolving it might as well be decades.

As remains the case today, the Review cited efficiency as a key driver for the growth of AI in contracting: “It has been estimated that inefficient contracting causes firms to lose between 5% and 40% of value on a given deal, depending on circumstances. But recent technological developments like AI are now helping companies overcome many of the challenges to contracting.”

Global legislators are struggling to keep pace with technological development — the case law around accountability in social media is only now taking shape, 25 years after the technology's advent. So it is clear that, while legislators scramble to grapple with the use of AI, the question of ultimate accountability is going to be fundamental.

Recent examples of the inappropriate use of AI include a lawyer in New York citing hallucinated cases in court — “hallucinated” being the term for AI inventing fictitious references — and directors using AI to summarise board packs to save time. The challenge lies in the expectation that individuals will apply their own expertise and knowledge (and the degree to which they do), as well as in the question of informed consent. Directors, for example, are specifically appointed to carry ultimate accountability, and the expectation is that they will apply themselves to their fiduciary responsibilities.

Online legal services ecosystem LegaMart recently noted that contract drafting, contract management, automated due diligence, contract review and analysis, and enhanced contract negotiation all offer immense time-saving and productivity benefits in the legal field. A McKinsey analysis concurs, finding that using AI to draft contracts reduced the time required from lawyers to 75% of what it once was. However, a person with knowledge and experience is still required to validate that the AI has performed appropriately, meaning there should still be an accountable person, natural or juristic.

Clearly, the rise of AI in contracting raises some pressing ethical questions: where does accountability lie? Should a dispute arise, how do we hold AI accountable? Who would be responsible if an AI contracts with someone? If damages occur, how do you claim from an AI? This extends to the automated process of collecting electronic signatures (or eSignatures): should a contracting party consent to interact with an AI in the same way that they are often required to consent to sign anything electronically?

LegaMart believes there is an ethical responsibility for human intervention and oversight to remain priorities in the contracting process: “A lawyer or other professional shall exercise due caution and reasonable care while using AI-based software in their regular usage. They should not compromise the competence, credibility, or quality of the contract. They should proofread the final draft and not completely rely on the AI.” But for how long? The world is continuously pushing these boundaries as human involvement diminishes in automated processes.

Another consideration is the enforceability of agreements and rights transfers; take NFTs as an example. These nonfungible tokens are built on blockchain technology and offer a distinctive approach to contracting: they deploy smart contracts and provide a unique ownership record that is impossible to forge. But while smart contracts claim to be the answer to protracted and cumbersome bureaucracy, even an NFT is only as good as the memorandum of incorporation that underpins its issuance.

NFTs don’t confer rights (or responsibilities) unless they are backed by a written contract; so, essentially, a smart contract may still need a written contract to be enforceable. Any derivative rights, or consents, extended beyond those initially laid out in the smart contract will need to be defined, agreed and recorded — and this is where simple, digital eSignature processes complement the process.

The reality is that — AI or no AI — NFTs and smart contracts won’t liberate us from the requirement to determine and enforce ultimate accountability and informed consent. They can’t protect parties to a contract in a society where legal claims turn on possession and ownership. There may be a lot of hype around NFTs and AI, but identity will always remain a critical component when contracting: the purpose of a contract is always to set out who is ultimately responsible. For this, eSignatures remain at the forefront of establishing identity with technical certainty, clearly demonstrating who the responsible parties are in every transaction.

• Peter is MD of Impression Signatures. 

