Lawtech products may need to be directly regulated as they become more complex and it becomes harder for lawyers to understand how the underlying algorithms work, new research has suggested.
The model of professional education and training will also need to change to adapt to new ways of working.
The paper by, and accompanying podcast with, leading academic Professor Lisa Webley, chair in legal education and research and head of the law school at Birmingham University, is the latest in a series on technology and regulation commissioned by the Legal Services Board.
The regulatory regime in England and Wales – which restricts only six areas of reserved legal activity to authorised lawyers – encouraged innovation, she said, but also put a lot of onus on clients to find out whether the people they were using were regulated.
“It makes it difficult for anyone to judge the quality of goods and legal services being developed in a market that is very open and relatively unregulated.”
The potential for innovation in this kind of market also allowed for “a regulatory environment that does not protect clients”, she cautioned.
“When it comes to technology, there may need to be more thought on whether we can continue to operate a system where we have authorised individuals and entities, and then we have everyone else operating in a world where risk is shoved towards the client, as opposed to being held by the company that developed the technology or the non-authorised professional.
“That is probably the arena that regulators are going to have to walk into sooner rather than later.”
Product regulation may be needed as the technology becomes more complex and it becomes harder for a professional to interrogate the results produced by the underlying algorithm.
Algorithms have already proved troublesome when used to inform sentencing decisions in the US because of the bias often inherent in the historical data they use.
Professor Webley explained: “Problems with algorithmic decision-making can only be challenged if the decision-making process can be understood and reviewed.
“The transparency of algorithms is currently insufficient to allow for many algorithmically rendered decisions to be subject to proper challenge. The inability effectively to understand and to challenge decisions made in this way would constitute a breach of natural justice grounds in many jurisdictions.”
She continued that while fact-based disputes may lend themselves to algorithmic decision-making, disputes that involve legal interpretation would still require human intervention if the common law were to continue developing.
“There are fears that if lawyers do not engage in a dialogic debate about how the law is to be interpreted in hard cases, then the law may become whatever the algorithm determines it to be over time.
“Law is more than a set of rules; it is a contested set of values that change over time as the societal context changes. If algorithms reach decisions on precedent, without dialogic debate about what the law should be, then the interpretation of the law may stagnate.”
The academic said lawyers working with AI-assisted tools would need training and support to make the best use of these systems and also to enable them to dig behind machine-generated results, “to test the basis upon which they were reached and provide them with confidence to question the results so as to be able to assess the relevance and quality of these suggestions in meeting their clients’ needs”.
She said: “Legal professionals will still need a solid knowledge of law, practice and its application, well-developed critical analytical skills in order to do this, and an appreciation of the basis of data science.
“Professional education and training models may need to adapt to new ways of working. A traditional approach to apprenticeship, learning from those with more experience, may not serve the profession at times of intense change.
“It is possible that the profession may shrink as routine tasks are automated, but many clients may still need interaction with and support from human lawyers at a challenging point in their lives and, if so, soft skills may become more important than ever.”
Professor Webley suggested that professional pathways within the legal profession may diverge further into “highly knowledgeable expert-systems lawyers, more routine service providers and legal technicians, which may require different regulatory approaches”.
She also raised the wider ethical issues involved, saying that the ethical underpinnings of regulation would have to take account not just of clients’ or consumers’ needs as purchasers or users of legal services and the justice system, but “wider societal needs for a fair, legitimate and effective legal system”.
This meant that the tools lawyers use and the way in which they handle client data “need to be beyond reproach”.
She said: “Professionals are more than simply expert service providers. They must engage critically with the positives and negatives that digital tools, particularly tools that automate decision-making, bring and work with legislators and the courts to ensure that fundamental values are protected rather than eroded.”