Chief Justice Roberts Sees Promise and Danger of A.I. in the Courts
Chief Justice John G. Roberts Jr. devoted his annual year-end report on the state of the federal judiciary, issued on Sunday, to the positive role that artificial intelligence can play in the legal system — and the threats it poses.
His report did not address the Supreme Court’s rocky year, including its adoption of an ethics code that many said was toothless. Nor did he discuss the looming cases arising from former President Donald J. Trump’s criminal prosecutions and questions about his eligibility to hold office.
The chief justice’s report was nevertheless timely, coming days after revelations that Michael D. Cohen, the onetime fixer for Mr. Trump, had supplied his lawyer with bogus legal citations created by Google Bard, an artificial intelligence program.
Referring to an earlier similar episode, Chief Justice Roberts said that “any use of A.I. requires caution and humility.”
“One of A.I.’s prominent applications made headlines this year for a shortcoming known as ‘hallucination,’” he wrote, “which caused the lawyers using the application to submit briefs with citations to nonexistent cases. (Always a bad idea.)”
Chief Justice Roberts acknowledged the promise of the new technology while noting its dangers.
“Law professors report with both awe and angst that A.I. apparently can earn B’s on law school assignments and even pass the bar exam,” he wrote. “Legal research may soon be unimaginable without it. A.I. obviously has great potential to dramatically increase access to key information for lawyers and nonlawyers alike. But just as obviously it risks invading privacy interests and dehumanizing the law.”
The chief justice, mentioning bankruptcy forms, said some applications could streamline legal filings and save money. “These tools have the welcome potential to smooth out any mismatch between available resources and urgent needs in our court system,” he wrote.
Chief Justice Roberts has long been interested in the intersection of law and technology. He wrote the majority opinions in decisions generally requiring the government to obtain warrants before searching digital information on cellphones seized from people who have been arrested and before collecting troves of location data about cellphone companies' customers.
During a 2017 visit to Rensselaer Polytechnic Institute, the chief justice was asked whether he could “foresee a day when smart machines, driven with artificial intelligences, will assist with courtroom fact-finding or, more controversially even, judicial decision-making?”
The chief justice said yes. “It’s a day that’s here,” he said, “and it’s putting a significant strain on how the judiciary goes about doing things.” He appeared to be referring to software used in sentencing decisions.
That strain has only increased, the chief justice wrote on Sunday.
“In criminal cases, the use of A.I. in assessing flight risk, recidivism and other largely discretionary decisions that involve predictions has generated concerns about due process, reliability and potential bias,” he wrote. “At least at present, studies show a persistent public perception of a ‘human-A.I. fairness gap,’ reflecting the view that human adjudications, for all of their flaws, are fairer than whatever the machine spits out.”
Chief Justice Roberts concluded that “legal determinations often involve gray areas that still require application of human judgment.”
“Judges, for example, measure the sincerity of a defendant’s allocution at sentencing,” he wrote. “Nuance matters: Much can turn on a shaking hand, a quivering voice, a change of inflection, a bead of sweat, a moment’s hesitation, a fleeting break in eye contact. And most people still trust humans more than machines to perceive and draw the right inferences from these clues.”
Appellate judges will not soon be supplanted, either, he wrote.
“Many appellate decisions turn on whether a lower court has abused its discretion, a standard that by its nature involves fact-specific gray areas,” the chief justice wrote. “Others focus on open questions about how the law should develop in new areas. A.I. is based largely on existing information, which can inform but not make such decisions.”