Beyond the hype: AI and the law

To go by some media reports, one might be forgiven for thinking that the lawyer of the 21st century is destined for the same fate as the switchboard operator or the ‘knocker-upper’ in the 20th – rendered superfluous by the accelerating march of automation. In particular, artificial intelligence (AI) and machine learning are suspected of taking over a large part of what is presently done by human lawyers.

Our research has suggested that the future may not be quite so bleak. The practice of law will, however, change and, as with other forms of technological change, that change will not necessarily happen in an orderly or linear fashion. In this article, we suggest some of the areas where that change might be most pronounced.

AI and recruitment

The first major change for future lawyers may occur before they even have their first job. Automated tools will assess job applications, and interviews may even be conducted by virtual recruiters. The technology takes several forms. Some systems operate on a candidate’s application materials. Text classification systems take a textual document (for instance, an application letter, a reference or a candidate’s biography) and assign it to one of a number of preset categories. Information extraction systems take a textual document and attempt to fill in more specific fields (for instance, a candidate’s skills, years of work experience, or highest educational qualification). Ranking systems use the outputs of these systems to order applicants by their likely suitability for the job, generating a shortlist for human recruiters.
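To make these components concrete, here is a minimal sketch in Python of the extract-then-rank pattern. The field names, skill list and scoring rule are illustrative assumptions only, not drawn from any particular product; real systems use trained NLP models rather than the regular expressions and keyword matching used here.

```python
import re
from dataclasses import dataclass

@dataclass
class CandidateProfile:
    name: str
    years_experience: int
    skills: set

def extract_profile(name: str, application_text: str) -> CandidateProfile:
    """Toy information extraction: pull structured fields out of free text."""
    match = re.search(r"(\d+)\s+years", application_text, re.IGNORECASE)
    years = int(match.group(1)) if match else 0
    known_skills = {"litigation", "conveyancing", "contract drafting", "legal research"}
    skills = {s for s in known_skills if s in application_text.lower()}
    return CandidateProfile(name, years, skills)

def rank(profiles, required_skills):
    """Toy ranking: order candidates by skill overlap, then experience.
    The weighting is illustrative only."""
    return sorted(profiles,
                  key=lambda p: (len(p.skills & required_skills), p.years_experience),
                  reverse=True)

applications = {
    "A. Candidate": "I have 6 years of experience in litigation and legal research.",
    "B. Candidate": "Two years in conveyancing and contract drafting.",
}
profiles = [extract_profile(n, t) for n, t in applications.items()]
for p in rank(profiles, required_skills={"litigation", "legal research"}):
    print(p.name, p.years_experience, sorted(p.skills))
```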

Some of these systems also consult documents found online, including (perhaps worryingly for some) the candidate’s social media presence, and can process pictures using image classification techniques. There are also more interactive systems, which engage in real-time dialogues with candidates and classify their contributions to the dialogue. Product developers claim these tools are much better than human recruiters at routine tasks: they can search through hundreds of applications faster, identify more qualified candidates, allegedly reduce hiring biases in screening interviews, and save time on routine tasks and pre-employment tests. There are also tools that perform other functions in HR: for instance, creating job descriptions, or matching candidates to those descriptions.

However, there are significant concerns about bias and discrimination embedded in the data used to train these tools. In one study, search engine results for “unprofessional hair for work” showed mainly results of black women with natural hair styles, while searching for “professional hair for work” offered pictures of coiffed white women. Online platforms have also been criticised for enabling employers to discriminate on grounds of age and for excluding female job seekers from recruitment campaigns through targeted job advertisements which were forwarded to men rather than to women.

New guidance has emerged on how to take the tools into account in a recruitment process. For example, some recruitment firms recommend that a candidate being interviewed by a virtual recruiter take a different approach from one interviewed by a human, such as using and repeating key words, referring to particular skills that the assistant will be processing and scoring, and being more aware of body language.

There have been calls for stronger ethics to guide the development and deployment of these tools to avoid discrimination in recruitment processes. In New Zealand, the use of these tools has implications for employers’ Human Rights Act and Privacy Act obligations. For example, if an unsuccessful candidate requested the results of an algorithmic tool that used their personal information, those results would need to be made available in a meaningful form. In Naidu v Australasian College of Surgeons [2018] NZHRRT 234, Dr Naidu asked the College for access to personal information held about him in relation to an application for admission to a specialist medical training course. The request was not answered within the statutory time period and, when it was finally complied with, the information included a score sheet with codes allocated to summaries of a referee’s views about Dr Naidu’s application. These scores were not in a form he considered meaningful (for example, it was unclear what the score was out of or whether it was weighted).

The tribunal noted that sections 42(1)(c) and (d) of the Privacy Act require information to be made available in a “form which can be comprehended”. Considering the proper application of these provisions to the score results, the tribunal referred to the European Union’s General Data Protection Regulation (GDPR), which introduced a number of new measures designed to strengthen protection of personal information in the context of automated decision-making, including the use of algorithmic processes. The tribunal ordered that the summary coding information be made available to Dr Naidu in a “meaningful” way, namely, “in a manner that is transparent, intelligible and easily accessible”.

Contract drafting

Contract drafting is a rapidly developing field with a diverse range of legal products. The Australian company SmarterDrafter, for example, has developed a contract drafting tool connected to Amazon’s voice-activated assistant, Alexa. SmarterDrafter works by using Alexa to ask a lawyer contract drafting questions (such as the names of the parties, the type of agreement, the jurisdiction of applicable law and so on). Based on the lawyer’s verbal responses, Alexa searches, for example, company or address information and jurisdictional material, and then automatically prepares a draft contract which is emailed to the lawyer for review. Depending on the nature or complexity of the contract, the draft agreement can be emailed in anything from a few seconds to several minutes. These systems use text classification techniques (for understanding users’ answers to their questions), combined with natural language generation techniques (for creating a draft contract).
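The underlying question-and-answer pattern can be sketched briefly. The Python fragment below is a toy illustration of template filling; the questions, clause wording and field names are hypothetical, and this is not SmarterDrafter’s implementation, which also handles voice input and external data lookups.

```python
from string import Template

# Hypothetical question set and clause template; commercial products draw
# these from large clause libraries and jurisdiction-specific rules.
QUESTIONS = {
    "party_a": "What is the first party's name?",
    "party_b": "What is the second party's name?",
    "jurisdiction": "Which jurisdiction's law applies?",
}

CONTRACT_TEMPLATE = Template(
    "SERVICES AGREEMENT\n\n"
    "This agreement is made between $party_a and $party_b.\n"
    "It is governed by the laws of $jurisdiction.\n"
)

def draft_contract(answers: dict) -> str:
    """Fill the template from the lawyer's answers. A production system
    would also validate answers (e.g. against a companies register) and
    select clauses conditionally."""
    missing = set(QUESTIONS) - set(answers)
    if missing:
        raise ValueError(f"Unanswered questions: {missing}")
    return CONTRACT_TEMPLATE.substitute(answers)

print(draft_contract({
    "party_a": "Alpha Ltd",
    "party_b": "Beta Ltd",
    "jurisdiction": "New Zealand",
}))
```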

Document analytics

Document analytics is a growing area, particularly in contract and commercial law. Products such as ThoughtRiver can analyse complex contracts and related documentation to create a digital contract summary, provide a preliminary narrative assessment of legal issues and a summary of governance and risk issues, make recommendations for triage, workflow and prioritisation, draft preliminary reports, and suggest benchmarks for progress. They use a combination of text classification and information retrieval techniques, plus document summarisation techniques. These sorts of products can be used to provide summaries of a client’s exposure to legal risks and can also be useful in more complex document reviews.
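As a rough illustration of the classification step in such products, the sketch below flags clauses against simple risk patterns. The patterns and risk labels are hypothetical assumptions; a product such as ThoughtRiver uses trained models rather than the hand-written rules shown here.

```python
import re

# Illustrative risk patterns only; real systems learn these from data.
RISK_PATTERNS = {
    "unlimited liability": re.compile(r"unlimited liability", re.I),
    "auto-renewal": re.compile(r"automatically renew", re.I),
    "unilateral termination": re.compile(r"terminate .* at any time", re.I),
}

def review(clauses):
    """Return (clause number, risk label) pairs for a first-pass triage."""
    findings = []
    for i, clause in enumerate(clauses, start=1):
        for label, pattern in RISK_PATTERNS.items():
            if pattern.search(clause):
                findings.append((i, label))
    return findings

contract = [
    "The supplier accepts unlimited liability for defects.",
    "This agreement will automatically renew for successive one-year terms.",
    "Fees are payable monthly in arrears.",
]
for clause_no, risk in review(contract):
    print(f"Clause {clause_no}: flagged for {risk}")
```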

Regulators are using these tools to enable speedy analysis of large volumes of documents and to speed up investigation and pre-trial evidential processes. In 2014, Britain’s Serious Fraud Office (SFO) used algorithms to work through more than 30 million pages of documents disclosed by Rolls-Royce in a discovery process, in order to determine which might be subject to privilege. Rolls-Royce cooperated with the SFO, under court oversight, giving the SFO access to a vast array of documents and consenting to the use of algorithms. The SFO reported that the algorithmic tools did the work in about one-tenth of the time it would have taken the 30 human lawyers otherwise needed for the task.

Virtual legal assistants

Virtual legal assistants based on natural language processing technology are becoming more common. For example, the Wellington Community Law Centre initiative Citizen AI has created RentBot, which uses natural language dialogue systems to enable people to ask questions about tenancy issues. Inadequate access to legal information is a significant barrier to access to justice, and service providers are routinely asked the same legal questions. These new tools have significant potential to improve access to justice by directly responding to routine legal questions without the need for a lawyer, with the quality of information based on thousands of previous similar questions and answers (much like a real-time ‘Frequently Asked Questions’ service). Because these tools provide legal information, rather than legal advice, they do not appear to be regulated services within the meaning of the Lawyers and Conveyancers Act. Citizen AI plans to launch an employment tool, WorkBot, later this year.
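A minimal sketch of the FAQ-style matching that can underlie such bots appears below. RentBot’s internals are not described here, so the matching method (simple word overlap) and the placeholder questions and answers are assumptions; production dialogue systems use trained natural language models.

```python
# Toy FAQ matcher: answer a user's question with the stored answer whose
# question shares the most words. Questions and answers are placeholders.
FAQ = {
    "how much notice must a landlord give to end a periodic tenancy":
        "General information about notice periods would appear here.",
    "can my landlord raise the rent during a fixed-term tenancy":
        "General information about rent increases would appear here.",
}

def answer(question: str) -> str:
    """Return the best-matching stored answer, or a fallback referral."""
    words = set(question.lower().replace("?", "").split())
    best_answer, best_overlap = None, 0
    for known_q, known_a in FAQ.items():
        overlap = len(words & set(known_q.split()))
        if overlap > best_overlap:
            best_answer, best_overlap = known_a, overlap
    return best_answer or "No close match; consider contacting a community law centre."

print(answer("Can the landlord raise my rent?"))
```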

Virtual assistant tools allow a much wider range of people to have access to legal information at a fraction of current costs, and also enable legal skills to be focused away from repetitive questions into more complex or nuanced areas. In the future, law firms might also offer these services to their clients, directing them to readily available information to answer simple questions and facilitating their interaction with a lawyer in more complex cases or in prescribed circumstances.

Lawyers, too, will soon be offered virtual assistance for some forms of legal research. LexisNexis announced it would be introducing a virtual assistant for online legal research, Lexis Legal Assistant, on its advanced platform in 2019. The bot would respond to written questions, save search results and be able to revisit previous research quickly. LexisNexis is also experimenting with a voice-activated legal search tool.

Technological developments are being closely watched by the courts, with judges in some jurisdictions being called upon to adjudicate related issues in both pre-trial processes and questions of costs. Two recent Canadian cases, for example, suggest that the use of computer-generated search results may be ethically required when carrying out legal research (see Cass v 1410088 Ontario Inc 2018 ONSC 6959 and Drummond v The Cadillac Fairview Corp Ltd 2018 ONSC 5350).

Whether or not the use of computer-assisted research might be ethically required in some cases, it seems clear that where algorithmic tools are used, a lawyer cannot rely solely on algorithmic results when giving advice to clients or in submissions to a court or tribunal. A lawyer would still have an overriding duty of care to ensure that such reliance was reasonable and did not breach any of the other fundamental professional obligations of lawyers, for example, the duty to facilitate the administration of justice and the duties owed as an officer of the court.

Implications

We do not think that natural language processing and AI tools will result in the wholesale replacement of lawyers any time soon. Rather, AI will increasingly augment legal work, and lawyers will increasingly be working with AI. The practice of law will change but, as with other forms of technological change, it is likely that this change will affect the legal profession unevenly. Most New Zealand law firms are small, and these firms do not have the same financial resources as large firms to invest in, and keep pace with, technological change.

The Law Society of England and Wales recently surveyed its members on issues of new technologies and found that most lawyers are not ready for these changes and do not think they will need advanced skills in this area, such as statistics or coding. In New Zealand, continuing professional development education is compulsory and self-directed. The Lawyers and Conveyancers Act and the associated Continuing Professional Development Rules do not contain a technology-specific duty of competence. However, a lawyer might identify a learning need, for example, in relation to computer-assisted legal research or the use of AI tools to aid in litigation.

Finally, concerns about the impact of new forms of technology on lawyers’ professional obligations, including client care, have prompted new or supplementary professional duties in some jurisdictions. For example, in 2012 the American Bar Association amended its Model Rules of Professional Conduct to introduce a ‘duty of technical competence’. Comment 8 on Model Rule 1.1 provides that to maintain “requisite knowledge and skill, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology …” (emphasis added). At least 35 states have since formulated rules of professional conduct that adopt the comment and model rule in some form.

Conclusion

Prophesying exactly how the next wave of automation will affect the world of work is something of a fool’s errand, and evidence and argument can be found to support almost any imaginable outcome. Nonetheless, our sense is that robots will not be replacing lawyers en masse any time soon. As Richard and Daniel Susskind have said, though, it seems that the least likely future is one where nothing much will change. Lawyers need to begin to prepare for these changes, taking opportunities to use new products and to improve or learn new skills through legal or other education.

Joy Liddicoat joy.liddicoat@otago.ac.nz is a human rights and technology lawyer and a member of the Artificial Intelligence and Law Project, along with Colin Gavaghan and Alistair Knott, who are Associate Professors at the University of Otago. The project is funded by the New Zealand Law Foundation.
