
The AI lawyer: will it ever happen?

While artificial intelligence (AI) has certainly caused major changes in the legal industry, one big question remains — will an AI tool be able to completely perform the work of a human attorney? In other words, is the AI lawyer a realistic concept that we will see in our lifetimes?

To this writer — a former litigation attorney who has been writing about legal tech for years — the AI lawyer still appears to be a distant fantasy.

There is no doubt that AI tools have excellent use cases for the legal profession. Nonetheless, AI still cannot replace the human element that makes attorneys indispensable, and that capability does not appear to be on the horizon any time soon.

Here’s why I think so:

Advantages of AI in legal practice

First, we need to understand the benefits AI has already brought to the legal industry.

AI contract review can spot issues and errors a flesh-and-blood legal professional sometimes misses.

With eDiscovery options like predictive coding, AI-powered tools can dramatically cut down on the time it takes to sort and review documents.
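For readers curious what predictive coding looks like under the hood, here is a minimal, purely illustrative sketch in Python: attorneys label a small seed set of documents as responsive or not, a simple text classifier learns from those labels, and the remaining documents are ranked so reviewers see the likeliest-responsive ones first. The scikit-learn pipeline and the toy documents are my own assumptions for illustration and do not depict any particular eDiscovery product.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy seed set: documents an attorney has already reviewed and labeled.
    # 1 = responsive to the discovery request, 0 = not responsive.
    seed_docs = [
        "Email discussing the disputed supply contract terms",
        "Invoice for the shipment at issue in the complaint",
        "Office holiday party planning thread",
        "Cafeteria menu for March",
    ]
    seed_labels = [1, 1, 0, 0]

    # Train a simple classifier on the attorney-reviewed seed set.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(seed_docs, seed_labels)

    # Score the unreviewed documents; reviewers start with the highest scores.
    unreviewed = [
        "Memo about renegotiating the supply contract",
        "Reminder to submit parking validation forms",
    ]
    scores = model.predict_proba(unreviewed)[:, 1]
    for doc, score in sorted(zip(unreviewed, scores), key=lambda pair: -pair[1]):
        print(f"{score:.2f}  {doc}")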

Legal research, too, took a big step forward when AI went beyond simply matching keywords and actually started to discern the meaning of words and phrases.
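To make that keyword-versus-meaning distinction concrete, here is a small sketch using the open-source sentence-transformers library and an illustrative embedding model; no specific legal research platform works exactly this way, and the query and passages are invented for the example. A literal keyword search for "evict" misses a passage about "terminating a tenancy," while comparing embeddings scores that passage as closely related to the question.

    from sentence_transformers import SentenceTransformer, util

    # Illustrative model choice; commercial research tools use their own models.
    model = SentenceTransformer("all-MiniLM-L6-v2")

    query = "Can a landlord evict a tenant without notice?"
    passages = [
        "A lessor must give the lessee written notice before terminating a tenancy.",
        "The limitations period for breach of contract is six years.",
    ]

    # Keyword matching: the literal word "evict" appears in neither passage.
    keyword_hits = [p for p in passages if "evict" in p.lower()]
    print("Keyword hits:", keyword_hits)  # -> []

    # Semantic search: embeddings relate "terminating a tenancy" to eviction.
    query_emb = model.encode(query, convert_to_tensor=True)
    passage_embs = model.encode(passages, convert_to_tensor=True)
    scores = util.cos_sim(query_emb, passage_embs)[0]
    for passage, score in zip(passages, scores):
        print(f"{float(score):.2f}  {passage}")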

The ultimate effect of AI used in these contexts is to elevate attorneys’ work.

An AI tool can take over repetitive and mundane work. Plenty of tasks at a law firm need to get done but do not require a sharp legal mind. When AI handles those tasks, attorneys can focus on the work that does demand real expertise: case assessment, negotiation, counseling, and advocacy.

It is important to note what AI is NOT doing in any of these scenarios — thinking, analyzing, or making independent assessments.

While the day could theoretically come when an AI tool can make case assessments on its own, that day appears to be far off. Artificial intelligence has come a long way, but it can still only handle simple, dedicated tasks. There’s a long way to go before an AI can truly think like a human.

AI disadvantages for the legal sector

When we discuss AI replacing lawyers, it is helpful to remember that this idea stems from one particular type of AI: generative AI.

Generative AI tools such as ChatGPT create written output in response to queries. That output comes from large language models trained on vast amounts of text, which is what allows these tools to write in a human-like way.
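As a rough, purely illustrative sketch of the underlying idea (this is a toy two-word lookup table, not how ChatGPT or any production model is actually built), a language model generates text by repeatedly predicting a statistically likely next word; the made-up probabilities below stand in for patterns a real model learns from enormous amounts of training text.

    import random

    # Made-up next-word probabilities standing in for what a large language
    # model learns from its training text. The model's only job is to pick a
    # statistically likely next word -- it does not check facts or sources.
    next_word_probs = {
        ("the", "court"): {"held": 0.5, "found": 0.3, "ruled": 0.2},
        ("court", "held"): {"that": 0.9, "as": 0.1},
        ("held", "that"): {"the": 0.6, "plaintiff": 0.4},
    }

    def generate(start, length=5):
        words = list(start)
        for _ in range(length):
            dist = next_word_probs.get(tuple(words[-2:]))
            if dist is None:
                break
            choices, weights = zip(*dist.items())
            words.append(random.choices(choices, weights=weights)[0])
        return " ".join(words)

    print(generate(("the", "court")))  # e.g. "the court held that the"

Because the output is driven entirely by these learned word patterns, it can read as fluent and confident whether or not the underlying claim is true.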

The mechanics of generative AI result in several weaknesses for its use in legal work.

First and foremost, one key to being a competent lawyer is the ability to apply rules — some of which are flexible, and others hard-and-fast — to the messy business of life. An attorney needs to apply common sense and understand nuance, while also interpreting and applying legal principles.

This requires judgment and creativity, especially in the face of endlessly varied factual scenarios and novel applications of the law. Artificial intelligence can’t do that yet, and it might not ever.

Another issue with generative AI is its penchant for inaccuracy and “AI hallucinations.”

AI tools are prone to factual errors and even outright fabrications, such as fake case citations and judicial opinions. Because generative AI is designed to produce responses that sound plausible, these mistakes are all the easier to miss.

Privacy concerns are another issue.

Generative AI operates by collecting and analyzing large amounts of data submitted by users. If a legal professional inputs private or confidential information about a client or their legal matter, it could be stored in the tool's database or even used to train the tool.

Accordingly, it may be advisable to avoid inputting this information, but this obviously limits the effectiveness of the AI tool for legal work.

AI not likely to replace lawyers any time soon

The main reason AI is not likely to replace lawyers is that it cannot replicate something essential for the legal profession — the human element.

An AI tool cannot argue before a judge, present a case to a jury, or provide compassionate counsel to clients. All of these scenarios require human-to-human interaction, and that will remain true for the foreseeable future.

Practically speaking, just consider how and when clients interact with attorneys.

A client may be facing criminal charges or a lawsuit, or they may have been catastrophically injured in an accident where another person was at fault. Their motivations will vary — perhaps they want to avoid the hassle and expense of a lawsuit and settle early, or they may want to press a legal matter to take a moral stand.

ChatGPT will simply not cut it here.

These clients will want a flesh-and-blood attorney to reassure them and provide guidance based on their wisdom and experience.

Another drawback of generative AI is that it does not provide the basis or sources for its conclusions.

As a former litigator, I can recall many instances of having to reassure clients repeatedly that a certain course of action was the best, often explaining my assessment in several different ways. Generative AI has no way of performing that role.

In short, working with clients, opposing counsel, and judges requires an emotional connection that generative AI cannot replace. Science fiction novels, TV shows, and films may provide us with depictions of androids that can perform as intelligent, sentient beings. But we are currently a long way from Westworld, so lawyers will remain human for the time being.

Author

  • Mike Robinson

    After a fifteen-year legal career in business and healthcare finance litigation, Mike Robinson now crafts compelling content that explores topics around technology, litigation, and process improvements in the legal industry.
