Hacker News

AI Police Reports: Year in Review

157 points by hn_acker last Tuesday at 5:30 PM | 115 comments

Comments

futuraperdita today at 7:06 AM

What worries me is that _a lot of people seem to see LLMs as smarter than themselves_ and anthropomorphize them into a sort of human-exact intelligence. The worst-case scenario of Utah's law is that when the disclaimer is added that the report is generated by AI, enough jurists begin to associate that with "likely more correct than not".

eterevsky today at 8:40 AM

I think whether any text is written with the help of AI is not the main issue. The real issue is that for texts like police reports, a human still has to take full responsibility for their contents. If we preserve this understanding, then the question of which texts are generated by AI becomes moot.

d1sxeyes today at 9:26 AM

> That means that if an officer is caught lying on the stand – as shown by a contradiction between their courtroom testimony and their earlier police report – they could point to the contradictory parts of their report and say, “the AI wrote that.”

Normally, if a witness (e.g. a police officer) were found to be recounting something written by a third party, it would be considered hearsay and struck from the record (on objection).

It would be an interesting legal experiment to have an officer using this system swear to which portions they wrote themselves, and attempt to have all the rest of the testimony disallowed as hearsay.

Manheim today at 7:58 AM

I find this article strange in its logic. If the use of AI-generated content is problematic in principle, I can understand the conflict. Then no AI should be used to "transcribe and interpret a video" at all, period. But if the concern is accuracy in the AI "transcript" and not the support from AI as such, isn't it a good thing that the AI-generated text is deleted after the officer has processed the text and finalized their report?

That said, I believe it is important to acknowledge the fact that human memory, experience and interpretation of "what really happened" are flawed. Isn't that why the body cameras are in use in the first place? If everyone believed police officers were already able to recall the absolute truth of everything that happens in situations, why bother with the cameras?

Personally I do not think it is a good idea to use AI to write full police reports based on body camera recordings. However, as a support in the same way the video recordings are available, why not? If, in the future, AI can write accurate "body cam" based reports, I would not have any problems with it as long as the video is still available to be checked. A full report should, in my opinion, always contain additional contextual info from the police involved and witnesses, to add what the camera recordings do not necessarily reflect or contain.

wyldfire today at 6:01 AM

> important first step in reigning in AI police reports.

That should be 'reining in'. "Reign" is -- ironically -- what monarchs do.

0x_rs today at 8:25 AM

I recommend taking a look at this video to get an idea of the thought process (or lack thereof) law enforcement might display when provided with a number of "AI" tools. Even if this one example is closer to traditional face recognition than LLMs, the behavior seems the same. Spoiler: complete submission and deference, and in this specific case to a system that was not even their own.

https://www.youtube.com/watch?v=B9M4F_U1eEw

avidiax today at 6:12 AM

This does sound problematic, but if a police officer's report contradicts the body-worn camera or other evidence, it already undermines their credibility, whether they blame AI or not. My impression is that police don't usually face repercussions for inaccuracies or outright lying in court.

> That means that if an officer is caught lying on the stand – as shown by a contradiction between their courtroom testimony and their earlier police report

The bigger issue, which the article doesn't cover, is that police officers may not carefully review the AI-generated report, and then, when appearing in court months or years later, will testify to whatever is in the report, accurate or not. So the issue is that the officer doesn't contradict inaccuracies in the report.

bushbaba today at 9:53 AM

To me it’s a question of whether they are on average better. It’s not like human-based input is perfect either.

intended today at 7:13 AM

> In July of this year, EFF published a two-part report on how Axon designed Draft One to defy transparency. Police upload their body-worn camera’s audio into the system, the system generates a report that the officer is expected to edit, and then the officer exports the report. But when they do that, Draft One erases the initial draft, and with it any evidence of what portions of the report were written by AI and what portions were written by an officer. That means that if an officer is caught lying on the stand – as shown by a contradiction between their courtroom testimony and their earlier police report – they could point to the contradictory parts of their report and say, “the AI wrote that.” Draft One is designed to make it hard to disprove that.

> Axon’s senior principal product manager for generative AI is asked (at the 49:47 mark) whether or not it’s possible to see after-the-fact which parts of the report were suggested by the AI and which were edited by the officer. His response (bold and definition of RMS added):

“So we don’t store the original draft and that’s by design and that’s really because the last thing we want to do is create more disclosure headaches for our customers and our attorney’s offices.”

Policing and Hallucinations. Can’t wait to see this replicated globally.

benatkin today at 7:22 AM

The experiments of AI agents sending emails to grown-ups are good, I think – AIs are doing much more dangerous stuff, like these AI police reports. I don't think making a fuss over every agent-sent email is going to cause other AI incursions into our society to slow down. The police report writer is a non-human, partially autonomous participant, like a K9 officer. It's wishful thinking that AIs aren't going to be set loose doing jobs. The cat is out of the bag.

maximgeorge today at 11:26 AM

[dead]

throw-12-16 today at 5:51 AM

“Fighting back” = adding a disclaimer.

You guys are so fucked.
