
AI will have its day in court — but human judges will always rule

The Legal Dimension of Artificial Intelligence (AI) is the third session theme of the International Conference on Justice, which is held at the Ritz Carlton in Riyadh. (Twitter @MojKsa)
06 Mar 2023 12:03:28 GMT9
  • Tech experts discuss pros and cons of using artificial intelligence in legal systems during Riyadh conference

Sulafa AlKhunaizi

RIYADH: Artificial intelligence could soon become a useful tool in courts, but the world is a long way from algorithms passing judgment on humans, experts have said during an event in Riyadh.

Experts at the International Conference on Justice, at the Ritz Carlton, discussed how the technology can be applied safely during a session titled “The Legal Dimension of Artificial Intelligence.”

Andrea Isoni, Director of the AI Technologies consultancy, told Arab News that there were many ways the technology could be of benefit, but also pitfalls to consider.

Lawrence Lessig, Professor of Law and Leadership at Harvard Law School. (Twitter @MojKsa)

“Preparation of documents, reading information from documents because those require a lot of reading, and extracting information from documents. All these processes are low-level security issues that (could) use the efficiency of AI,” he said.  

“The body of law is not ready yet to allow AI to judge people. Even in exams, if an AI scores you, then who is responsible? It is the same with the law and judging in court.

“Even if the technology is ready, the body of law needs to change substantially to determine who’s responsible. Someone has to be responsible if the AI goes wrong.”

Professor Ryan Abbott of the University of Surrey. (Twitter @MojKsa)

Isoni, who is also his consultancy’s Chief AI Officer, said that many countries’ judicial systems should look at AI to speed up court procedures.

In the session, speakers discussed the benefits and challenges of using AI in law.  

Lawrence Lessig, Professor of Law and Leadership at Harvard Law School, referenced an AI in California that had gathered enough legal knowledge to pass the state’s bar exam.  

The session was moderated by Andrea Isoni, Director and Chief AI Officer at AI Technologies. (Twitter @MojKsa)

“In 10 years, this technology will make possible the automation of, I think, 75 percent of what lawyers do. The most important thing for us to do now is to make sure that humans retain control,” he said.

“The system, though it will automate the vast majority of what lawyers do, (must) preserve a role for judgment and justice and an opportunity for those who are wronged by the technology to right those wrongs.”

Christopher Markou, a PhD candidate in the Faculty of Law at the University of Cambridge, believes that AI may know the letter of the law but cannot capture its “spirit.”

“The spirit of the law is really something mushy and in a gray area. The part that requires interpretation, that requires a cultured individual to be able to help make sense of what this rule not just said, but what it really is meant to do or achieve in society,” Markou told the audience.

Professor Ryan Abbott of the University of Surrey said that governments must consider proper regulation of AI.

“You might have an AI that could give an equally good answer to a question as a human being, and we will have to address how the regulatory system should deal with that,” said Abbott.

“When the law treats people and machines differently in terms of their behavior, it sometimes has negative outcomes for human beings and social wellbeing.”

Anupam Chander, Scott K. Ginsburg Professor of Law at Georgetown University Law Center, warned that machine-learning AI has been shown to exacerbate biases already prevalent in society because of the material it learns from.

He cited the example of Amazon’s AI hiring system, which the company later scrapped because it had learned to favor male applicants due to the material it had been fed.

“The AI was favoring men over women, and the reason was, it had been fed ten years of data of past résumés, which were heavily male and therefore not properly representing the characteristics that women might bring to the table.”

Abbott said AI has progressed in many areas, such as language and making music and art, but the law slips into a gray area if an AI invents new technology without human involvement.

“If an invention does not have a human inventor, it cannot be patented,” Abbott said. “So, if a drug company could use a very sophisticated AI to find a new treatment for COVID, they could not get a patent on that drug, and they would not have the right incentives to commercialize.”

So far, such patent applications have been accepted in Saudi Arabia and South Africa, and remain pending in other countries.

“These are questions that the Ministry of Justice here and regulators around the world are going to have to consider: what do we do when machines behave like people, and how do we encourage machines to behave in ways that are socially useful?” Abbott added.
