> According to the court documents, the Fargo detective working the case then looked at Lipps' social media accounts and Tennessee driver's license photo. In his charging document, the detective wrote that Lipps appeared to be the suspect based on facial features, body type and hairstyle and color.
> Once they were in hand, Fargo police met with him and Lipps at the Cass County jail on Dec. 19. She had already been in jail for more than five months. It was the first time police interviewed her.
How is this the fault of AI? It flagged a possible match. A live human detective confirmed it. And the criminal justice system, for reasons that have nothing to do with AI, let this woman sit in jail for 5 months before even interviewing her or doing any due diligence.
There's a reason why we don't let AI autonomously jail people. Instead of scapegoating an AI bogeyman, maybe we should look instead at the professional human-in-the-loop who shirked all responsibility, and a criminal justice system that thinks it is okay to jail people for 5 months before even starting to assess their guilt.
There's no way this isn't a slam dunk case to sue the piss out of the Fargo Police, probably the US Marshals and maybe other orgs. The woman in the surveillance photo clearly looks way younger, among the many other obvious signs this woman didn't do it. I hope she wrings at least several million dollars out of the government.
> facial recognition showed she was the main suspect in what Fargo police called an organized bank fraud case.
> Her bank records showed she was more than 1,200 miles away, at home in Tennessee at the same time police claimed she was in Fargo committing fraud.
> Unable to pay her bills from jail, she lost her home, her car and even her dog
It is an AI error, but also an error on the part of the cops, the prosecutors, the judge, and the county sheriff (who is responsible for the jail inmates). I hope everyone involved in this travesty is sued into oblivion and unable to hide behind their immunity defenses. Facial recognition should never be the sole basis for a warrant.
Wow, so many failures of the legal system. While the incompetent/malicious/lazy investigators that used the facial recognition and only that are obviously at major fault, I'd actually put larger blame on the judge that signed the arrest warrant. Judges are supposed to be a check on exactly that kind of incompetence/malice/laziness, not just a rubber stamp. Unfortunately there's really no recourse against incompetent/malicious/lazy judges.
Of course this would have been bad enough if it had happened where she lived, but the holding for 5 months adds a whole 'nother level of insight into the brokenness of the legal system. I'd be interested in hearing more about why that happened. Is that just something that happens sometimes if you have a public defender?
This reminds me of the British Post Office Scandal: https://en.wikipedia.org/wiki/British_Post_Office_scandal
lazy stupid pigs should be accountable for misusing AI like this and calling people into a system like that based on some AI's whim and a facebook peek, but having done no actual investigative work.
Let's see the pig that called for her arrest and wasted 5 months of her life spend 5 months in jail.
John Bryant, aka The Civil Rights Lawyer, recently did a piece about a similar case of mistaken identity. The consequences weren't as severe, but the willingness to trust the AI over any other evidence was the same:
https://thecivilrightslawyer.com/2026/03/11/ai-software-tell...
In the video, it shows a police officer blindly trusting a casino's AI software, even when a cursory investigation should have given any reasonable person enough of a reason to question whether the man he arrested was the same man accused of a crime. (And then even after it was confirmed he was not, the prosecutor continued to charge him for trespassing!)
Me: Whoa, cool, my hometown is atop Hacker News!
Also me, reading further: Uh-oh.
The chief of police also resigned today; wouldn't be shocked if this was part of the reason.
I really, really need folks to understand that deflecting blame away from the tool and trying to hold the human accountable feeds right into the marketing playbook of these companies in the first place.
The cops cannot be held accountable because the laws basically give them immunity. The politicians cannot be held accountable beyond being tossed out at the next election, because the laws otherwise give them immunity. The people operating the system cannot be held accountable, because the systems are marketed as authoritative despite being black boxes and lacking in transparency; they trusted the system just as they were told to, and thus cannot be held accountable.
And so when every human in the chain cannot be held accountable for these things, and the law prevents victims from receiving apologies, let alone recourse, then the tool and its maker are the only things we can hold accountable. By deflecting blame away from the tools ("it wasn't AI, it was facial recognition"; "the human had to sign off on it"; "humans made the arrest, not machines"), you're protecting quite literally the only possible entity that could still potentially be held accountable: the dipshits making these stupid things and marketing them as superior and authoritative when compared to humans.
You want accountability? Start holding capital to account, and this shit falls away real fucking fast. Don't get lost in technical nuance over very real human issues.
“Computers don’t argue” seemed charmingly wrong about how computers work until a few short years ago.
https://nob.cs.ucdavis.edu/classes/ecs153-2019-04/readings/c...
>Unable to pay her bills from jail, she lost her home, her car and even her dog. Fargo police say the bank fraud case is still under investigation and no arrests have been made.
I smell a lawsuit
They do not care.
End qualified immunity and see how fast cops start to do their jobs with care.
Winning a lawsuit literally ends in your own community members (not the cops) paying the bill.
This problem predates modern AI. https://en.wikipedia.org/wiki/Computer_says_no is built upon the deliberate abdication of responsibility to processes that cannot be held accountable. AI is just letting them do it at scale.
That doesn't mean we should accept it from AI. We should fight the blind yielding to the facade of authority regardless of whether the decision was made by an AI or an insect landing on a teleprinter at the wrong time.
Just reading the headline I said to myself: bet this is in America.
Every time I see something like this I can never quite believe this sort of stuff happens. Complete, life-ruining incompetence, with no consequences for the idiots that caused it. Ignoring the AI input, which to me has nothing to do with this (it was used as a tool to identify a potential suspect), this woman went to jail for 5 months on the opinion of someone with no other evidence. Only in America.
I wish we saw more invocations of speedy trial rights. Trials MUST begin for felony charges in ND within 90 days of a defendant invoking those rights (must be invoked within 14 days of arraignment)[0].
There's an opportunity for an "AI" app here. Takes your photo, compares with mugshots on police databases, quotes you for requisite cosmetic surgery.
/i
Gofundme? This woman needs some $$ and a lawyer. She may not know it yet, but if she makes some smart moves, she's about to be rich and Fargo is about to learn a very hard lesson.
Something big is missing from this story. How did facial recognition in ND pick out a little old grandma in TN, and why would a TN judge hold her without bail for 5 months?
Yeah, there is a whole lot more to this story.
It’s obvious from the one photo they posted of the actual suspect that the lady they arrested is about 20-30 years older than the woman in the bank photo. The woman in the photo is maybe 25-30 years old, this grandma looks like she’s 65-70 (actual age of 50).
Absolutely ridiculous, I hope she wins her civil case.
She will be enjoying a tidy compensation payout. And the number better have seven digits in it.
Facial recognition? *looks at photo* I've probably seen a dozen different people who look exactly like this woman just this week.
AI or not, it's unconscionable that victims of compulsory legal processes by way of mistaken identity are not made whole.
Even in Idiocracy they didn't have this problem
This is a badly written story. It should explain if she saw a judge or had a lawyer.
I read the article and I don’t really understand… she was held in a jail in Tennessee, but the article states they flew her to North Dakota? And somehow she’s a fugitive, so that’s why she doesn’t get bail? But she’s a fugitive held in her own state in a holding facility? And then when they release her, she’s in North Dakota? So if some state says you’re a fugitive, your home state will just hold you in jail until they come and put you on an airplane? Is that correct?
Wait - what was the AI tool and how did it have her face to begin with? If small-town police are doing face-matching searches across national databases then nobody is safe because the number of false positives is going to be MASSIVE by sheer number of people being searched every day.
Pretend the tool is 99.999999% specific. If it searches every face in the USA you're still getting about 3 false positives PER SEARCH.
You will never have a criminal AI tool safe enough to apply at a national scale.
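The arithmetic above can be sketched in a few lines. This is an illustrative base-rate calculation using assumed numbers (a hypothetical 99.999999% specificity and a rough US population figure), not stats from any real system:

```python
# Expected false positives per search against a national-scale face database.
# All figures are assumptions for illustration, not data from a real tool.
specificity = 0.99999999        # assumed: one-in-100-million false-match rate per comparison
population = 330_000_000        # approximate US population

false_match_rate = 1 - specificity
expected_false_positives = population * false_match_rate
print(f"~{expected_false_positives:.1f} false matches per single search")
```

Even at that (wildly optimistic) accuracy, every search of the full database surfaces about 3 innocent people, and since the true perpetrator usually isn't in frame-quality reference data, those false matches are often the only "hits" an investigator sees.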
Probable cause? What's that?
Judge/magistrate who signed off on the arrest warrant fucked up.
It's not an AI error. It's a human error in mis-using AI in this way. Saying it's an AI error is like saying a hole in your drywall is a hammer error.
Unfortunately we'll probably see a trend of people using AI and then blaming AI for cases where they mis-used AI in roles it's not good for or failed to review or monitor the AI.
How many more articles are we going to see with the headline AI facial recognition leads to innocent person jailed? A grandmother no less.
Some tech company illegally scanned people's photos on social media and now is using them, with our complicit legal system, to randomly put people behind bars. Now I need to worry that any day, due to a dice roll, I will be sent away to the middle of f'ing nowhere for months or years. Now the government wants to use these same dumb systems to build automated killing machines. FML!
I see a lot of comments trying to attribute blame to the cops, the lawyers, the police chief, the marshals, the tech bros, etc, but it is all of them and all of us that are guilty. We are so complicit in this sick system we live in. We are stuck in a collective action deadlock.
That fear you have in the back of your mind that says next time it might be you is counteracted by the thought "well thank goodness it wasn't me or a loved one," so you don't act. We are all doing this, that is why nothing changes.
The only people able to act these days are the most insane. The narcissistic corrupt power-hungry politician, the psychopathic tech bro billionaire, and the Jacobins are the only ones with the energy to wade through this cesspool, and that is why everything is so dystopian.
This is exactly what I would expect from the great state of ND.
https://archive.is/yCaVV - Archive link to get around the paywall.
https://www.theguardian.com/us-news/2026/mar/12/tennessee-gr... - Another article on this without a paywall.
It's annoying that both articles are calling this AI error. This was human error, the police did the wrong thing and the people of Fargo will end up paying for this fuckup.
I live in Fargo. The police chief announced his retirement yesterday. Done by the end of the month. And then today this article comes out. So now we pretty much know why the sudden retirement announcement.
We are rapidly becoming a world where every person is one inscrutable LLM decision from having their life ruined with no recourse.
This type of incident isn't new and is only going to get worse. The problem is our governments are doing absolutely nothing about it. I'll give two examples:
1. Hertz implemented a system where they falsely reported cars as being stolen. People were arrested and went to jail for rental cars that were sitting in the Hertz lot. Hertz ultimately had to pay $168 million in a settlement [1]. That's insufficient. If I, as an ordinary citizen, make a false police report that somebody stole my car I can be criminally charged. And rightly so. People should go to jail for this and it will continue until they do. These fines and settlements are just the cost of doing business; and
2. The UK government contracted Fujitsu to produce a new system for their post offices. That system was allowed to produce criminal charges for fraud that were completely false. People committed suicide over this. This went on for what, a decade or more? It eventually resulted in a parliamentary inquiry and settlements. It's known as the British Post Office scandal [2]. Again, people should go to jail for this.
The choice we as a society face is whether to have automation improve all of our lives by raising everyone's standard of living and allowing us to do less work, and less menial work, or to allow automation to further suppress wages so the Epstein class can be slightly more wealthy.
[1]: https://www.npr.org/2022/12/06/1140998674/hertz-false-accusa...
[2]: https://en.wikipedia.org/wiki/British_Post_Office_scandal
What’s remarkable to me, beyond the total incompetence and stupidity of all the police people involved, is how incredibly aggressive the intervention was.
This is a bank fraud case, for god’s sake, not an armed robbery. I don’t know the scale of it, but still, no one said she was a danger to anyone. She was a suspect, not a convict, and she was held at gunpoint while babysitting young children. What in the fucking world?
The US is so fucked up lately. People should chill the fuck out.
Completely infuriating, but more of a commentary on the sad state of incompetent power-hungry law enforcement with tools they don't know how to use than the tools themselves.
Though, the question remains: are the tools built in such a way as to deceive the user into a false sense of trust or certainty?
_Some_ of the blame lies on the UX here. It must.
America’s repulsive classism at work to be indifferent to her rights like this
There's a lot of talk about how the cops just misused the tool and it's their fault, not the AI's.
That's missing the point here. The point is that these tools provide crazy leverage, and that can be good or bad. Used carefully, they can definitely catch criminals faster, but when misused (or abused) they let the authorities unjustly ruin lives faster.
The question isn't whether AI is perfect or not. It's whether you trust the authorities with it, to use and abuse as they can. Think about the average cop. Think about the way Trump treats people. Think about the way Israel keeps an ongoing genocide going. Think about the cases of police brutality that happen in the US, the cases of racial profiling. Think about ICE and their behavior, going around kidnapping and killing people. Do you want these people to have more leverage?
[dead]
I hate this headline (not blaming submitter). Police incompetence and negligence jailed her for months and left her stranded in a North Dakota winter. The AI is no more responsible than the cars and airplanes they used.
Edit: this is in reference to the original headline "AI error jails innocent grandmother for months in North Dakota fraud case" not the revised title that it was changed to.
[flagged]
I posted this 9 hours ago. Can I get the karma transferred to my account?
[dead]
Why the fuck does a newspaper need a ‘notifications’ icon in the top right hand corner?
https://archive.ph/2026.03.12-183903/https://www.grandforksh...