
Police Use AI to Create Reports. Will Those Reports Be Accepted in Court? – NBC New York

A body camera captured every word and bark uttered as police Sergeant Matt Gilmore and his K-9 dog, Gunner, searched for a group of suspects for nearly an hour.

Normally, the Oklahoma City police sergeant would have grabbed his laptop and spent an additional 30 to 45 minutes writing a report about the search. But this time, he used artificial intelligence to write the first draft.

From all the sounds and radio conversations picked up by the microphone attached to Gilmore’s body camera, the AI tool produced a report in eight seconds.

“It was a better report than I could have written, and it was 100 percent accurate. It was more fluid,” Gilmore said. The report even documented something he didn’t remember hearing: another officer’s mention of the color of the car the suspects had fled in.

The Oklahoma City Police Department is one of the few experimenting with artificial intelligence chatbots to produce early drafts of incident reports. Officers who have tried it are excited about the time-saving technology, while some prosecutors, police watchdogs and legal experts worry about how it could change a cornerstone of the criminal justice system that plays a role in determining who gets charged or jailed.

Built with the same technology as ChatGPT and sold by Axon, best known for developing the Taser and as the leading U.S. supplier of body cameras, it could become what Gilmore describes as another “game changer” for police work.

“They become police officers because they want to do police work, and spending half their day entering data is just a tedious part of the job that they hate,” said Axon founder and CEO Rick Smith, describing the new AI product — called Draft One — as having the “most positive reaction” of any product the company has introduced.

“There are certainly concerns,” Smith added. He said district attorneys prosecuting a criminal case want to be sure that police officers — and not just an AI chatbot — are accountable for writing their reports, because they may be asked to testify in court about what they witnessed.

“They never want to hear an officer on the stand say, ‘The AI wrote that, not me,'” Smith said.

Artificial intelligence is not new to police departments, which have adopted algorithmic tools to read license plates, recognize suspects’ faces, detect the sounds of gunshots and predict where crimes might occur. Many of these applications have raised privacy and civil rights concerns, and lawmakers have tried to put in place safeguards. But the introduction of AI-generated police reports is so new that there are few, if any, safeguards to guide their use.

Concerns about societal bias and prejudice built into AI technology are just part of what Oklahoma City community activist Aurelius Francisco finds “deeply troubling” about the new tool, which he learned about through The Associated Press.

“The fact that this technology is being used by the same company that supplies the department with Tasers is alarming enough,” said Francisco, co-founder of the Foundation for the Liberation of Minds in Oklahoma City.

He said automating these reports would “facilitate the ability of police to harass, surveil and inflict violence on community members. While it makes the job of police officers easier, it makes the lives of Black people and people of color more difficult.”

Before testing the tool in Oklahoma City, police officials showed it to local prosecutors, who advised caution before using it in high-stakes criminal cases. For now, it is being used only for reports of minor incidents that don’t result in arrests.

“So no arrests, no misdemeanors, no violent crimes,” said Oklahoma City police Capt. Jason Bussert, who manages information technology for the 1,170-officer department.

That’s not the case in another city, Lafayette, Indiana, where Police Chief Scott Galloway told the AP that all of his officers can use Draft One on any type of case and that it has been “incredibly popular” since the pilot project began earlier this year.

Or in Fort Collins, Colorado, where police Sergeant Robert Younger said officers were free to use it for any type of report, though they found it didn’t work well when patrolling the downtown bar district due to an “overwhelming amount of noise.”

In addition to using AI to analyze and summarize audio recordings, Axon experimented with computer vision to summarize what is “seen” in video footage, before quickly realizing that the technology wasn’t ready.

“Given all the sensitivities around policing, around race and other identities of the people involved, this is an area where I think we’re going to have to do some real work before we introduce it,” said Smith, Axon’s CEO, describing some of the responses tested as not “overtly racist” but insensitive in other ways.

Those experiences led Axon to focus directly on audio in the product it unveiled in April at its annual corporate conference for police officials.

The technology is based on the same generative AI model used by ChatGPT, developed by San Francisco-based OpenAI. OpenAI is a close business partner of Microsoft, Axon’s cloud computing provider.

“We use the same underlying technology as ChatGPT, but we have access to more buttons and dials than a real ChatGPT user,” said Noah Spitzer-Williams, who manages Axon’s AI products. By turning down the “creativity dial,” the model sticks to the facts so it “doesn’t embellish or hallucinate in the same way that it would if you were using ChatGPT on its own,” he said.

Axon doesn’t say how many police departments are using the technology. It’s not the only vendor; startups like Policereports.ai and Truleo offer similar products. But given Axon’s close relationships with the police departments that buy its Tasers and body cameras, experts and police officials expect AI-generated reports to become more widespread in the months and years to come.

Before that happens, legal scholar Andrew Ferguson would like to see more public discussion about the potential pros and cons of the practice. For one, AI chatbots’ large language models are prone to inventing false information, a problem known as hallucination that can add convincing and hard-to-detect falsehoods to a police report.

“I worry that automation and the ease of use of the technology will cause police officers to be less careful about how they write,” said Ferguson, a law professor at American University who is working on what is expected to be the first law journal article on the emerging technology.

Ferguson said a police report is important in determining whether an officer’s suspicions “warrant the loss of a person’s liberty.” It is sometimes the only testimony a judge sees, especially for minor offenses.

Human-generated police reports also have flaws, Ferguson said, but the question of which is more reliable remains open.

For some officers who have tried it, the tool has already changed the way they respond to a reported crime: they narrate aloud what is happening so the body camera can better capture what they would otherwise have to write down.

As technology advances, Bussert expects officers to become “more and more verbal” in describing what’s in front of them.

After Bussert loaded video of a traffic stop into the system and pressed a button, the program produced a narrative-style report in conversational language that included dates and times, just as an officer would have typed it from his notes, all based on the body camera audio.

“It only took a few seconds,” Gilmore said, “and it was done to the point where I thought, ‘I don’t have to change anything.'”

At the end of the report, the officer must check a box indicating that it was generated using AI.

—————

O’Brien reported from Providence, Rhode Island.

—————

The Associated Press and OpenAI have a licensing and technology agreement that allows OpenAI to access a portion of the AP’s text archive.