US police are using technology that records sound from their body cameras to write incident reports in eight seconds.
Some police departments in the United States are experimenting with artificial intelligence (AI) chatbots to prepare the first drafts of their incident reports.
A technology built on the same generative AI model as ChatGPT pulls sound and radio chatter from an officer's body camera microphone and can produce a report in eight seconds.
“It was a better report than I could have ever written, and it was one hundred percent accurate. It flowed better,” says Oklahoma City Police Sergeant Matt Gilmore.
The new tool could join an expanding set of AI tools US police are already using, such as algorithms that read license plates, recognize suspects’ faces or detect gunshots.
Few guidelines on using AI reports
Rick Smith, CEO and founder of Axon, the company behind the AI product called Draft One, said AI has the potential to eliminate the paperwork officers have to do, freeing up more time for the work they want to do.
But Smith acknowledged there are concerns, as with other AI technologies used by police. He said they come mainly from district attorneys who want to make sure officers know what is in their reports in case they have to testify in a criminal trial about what they saw at a crime scene.
Police reports generated by artificial intelligence are so new that there are few, if any, rules guiding their use.
In Oklahoma City, officials showed the tool to local prosecutors, who advised caution before using it in high-stakes criminal cases.
But in other US cities, officers can use the technology in any case, or however they see fit.
Concern over AI racial bias
Legal scholar Andrew Ferguson would like to see more public debate about the benefits and possible harms of this technology before it is put into use.
For one thing, the large language models behind AI chatbots are prone to inventing false information, a problem known as hallucination, which could add convincing and hard-to-detect falsehoods to a police report.
“I worry that automation and the ease of technology will make police officers less careful when writing,” said Ferguson, a law professor at American University who is working on what is expected to be the first law review article about this emerging technology.
According to Ferguson, a police report is important in determining whether an officer’s suspicion “justifies someone’s loss of liberty.” Sometimes it is the only testimony a judge sees, especially in minor crimes.
Human-generated police reports also have flaws, Ferguson said, but it is not clear which is more reliable.
Concern that societal racial biases and prejudices are built into AI technology is just part of what Oklahoma City community activist Aurelius Francisco finds “deeply troubling” about the new tool, which he learned about from the AP.
In his opinion, automating those reports “will facilitate the police’s ability to harass, surveil and inflict violence on community members.”