(FOX 2) – Every courtroom drama references it: the truth. Probably because the truth is the foundation of the American legal system.
And right now, it’s under attack from an elusive, ill-defined enemy: deepfakes.
Many people have seen examples, from fake videos of Morgan Freeman to music falsely passed off as the work of Drake and The Weeknd. But what does it look like in the courtroom? Experts say artificial intelligence will likely one day be used to manipulate evidence.
Some believe it already has.
“I’d be very surprised to find that there isn’t,” said Joe Tavares, an artificial intelligence expert. “I mean, forgeries have been around for thousands of years at this point. So we’ve just changed the medium.”
Decisions made in the courtroom can have life-changing implications. But if that evidence turns out to be a so-called deepfake, created with artificial intelligence and presented by an attorney, the existential threat it poses could be unmatched.
And attorneys are starting to take notice of the danger.
“I think it’s something that really cuts across the criminal and civil legal systems,” said Washtenaw County Prosecutor Eli Savit. “And I don’t think anybody should be sanguine about it. Or think that, you know, this is a good development.”
“The things that I’m reading about are the nervousness of the lawyers and judges, because imagine our juror system, looking at all the evidence with this type of skepticism,” said Mike Morse, a well-known personal injury attorney.
One of the first issues is that the technology for creating deepfakes has evolved much faster than the tools for detecting them.
“Currently, there is not really,” Tavares said when asked whether deepfakes can reliably be detected. “Like if you had a well-funded individual or state that wanted to do something like that, it’d be very difficult to tell the difference.”
A team at Oakland University studying artificial intelligence warned that the integrity of the courts could be on the line.
“Deepfakes can disrupt the functionality of the courts,” said Prof. Khalid Malik. “Even if there’s something real, people may deny it, saying, oh, you know what? It’s a deepfake.
“It’s a sort of cat-and-mouse game. So we, as a society, need to join our hands together to solve these problems,” he added.
But solving the problem is easier said than done.
Attorney Mike Morse said one possible solution could lie in the vetting of evidence – a process called discovery.
“Usually, as a lawyer, somebody hands you evidence, and you think it’s real,” he said. “For 30-plus years, if the defense sent me a video of my client doing something, it was real. But now we’re taking a closer look; we’re having to hire experts to look at videos, recordings, photographs.”
For prosecutors, the overhaul of the discovery process presents another issue – the cost.
“If you are somebody that, you know, may not have a lot of money to pay an expert – imagine a landlord-tenant case, right? You might be behind the eight-ball and not able to prove that your evidence is exactly what it says that it is,” said Savit.
One possible route would be raising the bar for what evidence can make it into a courtroom. The Federal Rules of Evidence, which govern the standards evidence must meet before it can be used, were last updated in 2020.
Three years is an eternity in technology, and the industry pushing A.I. to its limits has lapped the court system several times over. The problem of authenticating evidence is largely untested in the digital realm.
And then there are the jurors. They’re typically shown an orientation video describing their role in the legal system. One of them, narrated by the Hon. Shauna Dunnings, says it’s a juror’s job to evaluate the evidence and make a decision based on proof beyond a reasonable doubt.
But what happens when deepfakes warp how much a jury is willing to trust the evidence? The already blurry lines between common sense, reason, and speculation become even harder to see through.
“When people are starting to see fake videos, they’re going to be skeptical, and we’re going to have to deal with that as lawyers,” said Morse.
“We trust the jury, we trust the judge, if it’s a trial in front of a judge, to be the finder of fact, to make a determination based on all the evidence,” said Savit. “But if some of that evidence is fake, and it does get in, that’s problematic for the administration of justice.”
Both attorneys agree that deepfakes have become more common as they’ve gotten easier to make. And that will make it harder for jurors.
“Juries might just naturally become more skeptical of photographic and video evidence to begin with, and maybe that’s a good thing if we’re really seeing a prevalence of deepfakes,” said Savit.
The consequences could be devastating to the integrity of the court since it invites a prosecutor’s worst nightmare: a wrongful conviction.
“And if evidence that, you know, we might be relying on turns out to have been faked, it really raises the possibility that there could be more wrongful convictions,” Savit said.
In an effort to keep up with the evolving technology, the Massachusetts Institute of Technology created a test to help people check whether they can tell the difference between a deepfake and the real thing. You can try it out for yourself here.
As for what officials can do now, most agree that educating the public is the best defense as institutions work to catch up, especially since many jurors may not even know what a deepfake is.
“I think it would be really good for the courts to kind of assemble like an expertise in that field and probably put them on staff, rather than relying on the prosecution or the defense to bring in their own experts,” Tavares said. “So that relies on general society to have a good understanding of what’s out there and what could be out there.”