FEATURE-As Malaysia tests AI court convictions, some lawyers fear for justice

* Malaysia set to conclude trial of AI tool in sentencing

* Authorities say AI solves cases cheaply, fairly and quickly

* Lawyers question lack of discretion and narrowness of dataset

By Rina Chandran

BANGKOK, April 12 (Thomson Reuters Foundation) – Few cases shake Hamid Ismail after nearly two decades as a lawyer, but he was surprised when a man he was defending was convicted with the help of an artificial intelligence tool in the Malaysian state of Sabah. Ismail knew that courts in Sabah and neighboring Sarawak were testing the AI tool for sentencing recommendations as part of a nationwide pilot project, but he worried the technology was being used before lawyers, judges and the public fully understood it.

There was no proper consultation on the use of the technology, and it is not provided for in the country's criminal code, he said. "Our Code of Criminal Procedure does not provide for the use of AI in court… I think it is unconstitutional," Ismail said, adding that the sentence the AI recommended for his client, who faced a minor drug possession charge, was too severe.

Courts in Sabah and Sarawak piloted software developed by Sarawak Information Systems, a state government firm, which said at the time that it had held consultations during the process and taken steps to address some of the concerns raised. Around the world, the use of AI in criminal justice systems is growing rapidly, from the popular chatbot mobile app DoNotPay https://donotpay.com to robot judges adjudicating small-claims cases in Estonia, robot mediators in Canada and AI judges in Chinese courts.

Authorities say AI-based systems make convictions more consistent and can eliminate case backlogs quickly and inexpensively, helping all parties to court proceedings avoid long, costly and stressful litigation. More than a third of government respondents to a global survey conducted last year by research firm Gartner said they plan to increase investment in AI-powered systems, including chatbots, facial recognition and data mining in all sectors.

This month, federal authorities in Malaysia are aiming to conclude their nationwide trial of AI sentencing tools, which they say "can improve the quality of judgment", although they have not made clear exactly how the tools will be used in the courts. A spokesperson for Malaysia's chief justice said the use of AI in the courts was "still at the trial stage", declining to comment further.

BIAS, MITIGATING FACTORS

Critics warn that AI risks entrenching and amplifying biases against minorities and marginalized groups, saying the technology lacks a judge's ability to weigh individual circumstances or adapt to changing social mores.

"During sentencing, judges don't just consider the facts of the case – they also consider mitigating factors and use their discretion. But AI can't exercise discretion," Ismail told the Thomson Reuters Foundation. Considering aggravating and mitigating factors "requires a human mind," said Charles Hector Fernandez, a Malaysian human rights lawyer.

"Sentences also vary with the times and public opinion. We need more judges and prosecutors to handle growing workloads; AI cannot replace human judges," he added. Seeking to address concerns that its AI software could lead to bias in sentencing, Sarawak Information Systems said it had removed the "race" variable from the algorithm.

But while "such mitigations are valuable, they don't make the system perfect," said a 2020 report on the tool by the Khazanah Research Institute (KRI), a policy think tank. It also noted that the company had trained the algorithm on only five years of data, from 2014 to 2019, "which seems somewhat limited compared to the large databases used in global efforts."

Sarawak Information Systems could not be reached for comment on whether it had since expanded its dataset. A KRI analysis of cases in Sabah and Sarawak showed that judges followed the AI's sentencing recommendation in a third of cases, all of which involved rape or drug possession under the terms of the two-state pilot.

Some of the judges reduced the suggested sentences in light of mitigating factors. Others toughened them on the grounds that they would not serve as a "strong enough deterrent".

"OPAQUE ALGORITHM"

Technology has the potential to improve the efficiency of the criminal justice system, said Simon Chesterman, a law professor at the National University of Singapore. But its legitimacy depends not only on the correctness of the decisions made, but also on how they are made, he added.

"Many decisions could properly be entrusted to machines. (But) a judge should not entrust discretion to an opaque algorithm," said Chesterman, senior director of AI Singapore, a government program. The Malaysian Bar Council, which represents lawyers, has also expressed concern over the AI pilot.

When courts in the capital, Kuala Lumpur, began using it in mid-2021 to hand down sentences for 20 types of crimes, the council said there had been "no guidance at all and we had no opportunity to get feedback from criminal law practitioners." In Sabah, Ismail appealed against the AI tool's sentencing recommendation for his client, which the judge had followed.

But he said many lawyers would not contest such recommendations, potentially leaving their clients with overly harsh sentences. "The AI acts as a primary judge," Ismail said.

“Young magistrates may think this is the best decision and accept it without question.”

