California Police Use AI to Transcribe Body Cam Videos

(TNS) — A new front in the battle over the benefits of artificial intelligence versus its risks is opening up in law enforcement, where police departments are increasingly using the smart software to write up incident reports — to the concern of civil libertarians.

Earlier this year, the Fresno Police Department began experimenting with a product called Draft One sold by body-camera maker Axon that uses software from San Francisco’s OpenAI to transcribe video recordings taken on officers’ body-worn cameras and create a first draft of a police report. It’s the largest police force in California so far to try the technology.

Police departments in San Mateo, East Palo Alto and Campbell are also using Axon’s Draft One and have praised its speed and time-saving capabilities. The San Francisco Police Department said in a statement it is not using the technology.

Civil libertarians have misgivings about the nascent tool, raising concerns that AI could make mistakes as it works its way into the evidence room and the courtroom. At least one district attorney, in Washington state, has warned of the possibility of errors.

“We think police departments should not be using this technology, and that introducing novel AI technology like this in the criminal justice context raises a bunch of civil liberties and civil rights concerns,” said Matt Cagle, an ACLU of Northern California attorney who focuses on technology and civil liberties.

Fresno Deputy Chief Rob Beckwith said his officers are using Draft One under a pilot program and only to write up misdemeanor calls. So far, he said, the department hasn’t had any problems with errors in transcriptions.

“It’s nothing more than a template” for an officer to finalize, said Beckwith. “It’s not designed to have an officer push a button” and generate a report, he said, adding his department has consulted with the Fresno County District Attorney’s Office in training his 400-officer force to use the program.

“I’m hopeful that it expands” beyond the pilot, Beckwith said.

Fresno’s experience comes as many other police departments across the country — and increasingly in the Bay Area — turn to AI to expedite and automate parts of the paperwork-heavy criminal justice process.

Though Axon is the market leader, rival products are also popping up. A startup called Abel offers a similar tool that police are using in Richmond, according to a report in TechCrunch. Neither Abel nor Richmond could be reached to confirm the details.

Aside from transcription of body-camera footage and drafting police reports, some California public defenders and a handful of prosecutors are using AI systems to more quickly parse mountains of evidence — sometimes resulting in speedier trials.

Not all district attorneys are on board with the emerging technology. A Seattle-area prosecutor’s office recently warned local law enforcement against using AI programs like Draft One for creating police reports, saying the technology is still developing and can make small errors that are easily missed, even with human review.

Deputy Chief Beckwith said the goal is to halve the number of hours officers spend writing reports so they can spend more time on patrol or other duties. “The feedback we have gotten from officers has been they are saving a lot of time,” he said, though the department did not have specific metrics on how much.

The ACLU’s Cagle said the time savings aren’t clear and that using only an audio track to produce a police report opens the door for potential inaccuracies.

“Defendants have a right to confront their accusers and to scrutinize the case being made by the people in government who want to convict them of crimes,” he said. “When you introduce a computer product whose workings under the hood are not completely clear, you introduce serious accountability and transparency issues.”

Axon did not respond to a request for comment about how the program is trained or how often it makes errors.

During an online event Dec. 4 put on by the nonprofit Council on Criminal Justice to discuss the intersection of AI and criminal justice, Yasser Ibrahim, Axon’s executive for AI, said that programs like Draft One are not intended to replace human work. Instead, he said, the idea is to speed up repetitive tasks. The test of whether the technology is working well is, “does it make the right thing happen reliably?” Ibrahim said, or “does it make the bad thing happen more often?”

During the same event discussing AI’s role in criminal justice, UC Berkeley Law Professor Rebecca Wexler said she worried that better-resourced police and prosecutors using AI “might distort the adversarial (legal) system if tools are developed for one side.”

That is where the startup JusticeText comes in. The company’s software, used mostly by defense attorneys in places including Sacramento and Modesto, creates searchable transcripts of evidence in a case, such as body-camera footage, 911 calls and police interrogations.

That kind of evidence is usually handed over by prosecutors in large tranches to defense attorneys such as Stanislaus County Public Defender Reed Wagner, who uses JusticeText to rapidly parse through it.

When wading through voluminous evidence, “you are immediately playing catch-up as it compares to the prosecution,” Wagner said. The software allows him to move faster and to zero in on the most important pieces of evidence. “It doesn’t change my responsibility,” Wagner added. “If I’m in a serious trial, I still need to watch all of the footage” from a body-camera interaction, for example, he said.

Wagner said all 30 attorneys in his office use the software, which costs $1,200 per attorney per year, to different degrees.

The same issues of accuracy raised by Cagle, the ACLU attorney, could exist when AI programs like JusticeText process footage or audio into transcripts. But using AI for criminal defense compared to for policing is different, Cagle said, since “good criminal defense attorneys are going to check the output of these kinds of systems every which way.”

Benefits aside, said Cagle, introducing “opaque” technology into the criminal justice system could have a range of unintended consequences. “The reliability issues that pervade gen(erative) AI products have not been conclusively fixed,” he said.

© 2024 the San Francisco Chronicle. Distributed by Tribune Content Agency, LLC.

