
tadzikpk today at 2:55 AM

On page 13 you'll see _why_ the judges don't apply the letter of the law - they're seeking to do justice to the victims _in spite of_ the law.

"there is another possible explanation: the human judges seek to do justice. The materials include a gruesome description of the injuries the plaintiff sustained in the automobile accident. The court in the earlier proceeding found that she was entitled to [details] a total of $750,000.10. It then noted that she would be entitled to that full amount under Nebraska law but only $250,000 under Kansas law." So the judge's decision "reflects a moral view that victims should be fully compensated ... This bias is reflected in Klerman and Spamann’s data: only 31% of judges applied the cap (i.e., chose Kansas law), compared to the expected 46% if judges were purely following the law." "By contrast, GPT applied the cap precisely"

Far from making the case for AI as a judge, this paper highlights the gap between an AI that systematically applies (often harsh) laws and the empathy of experienced human judgement.


Replies

DrewADesign today at 3:07 AM

So many “AI is going to replace expert ______” assertions come from computer scientists not realizing how little they understand the real-world requirements of those roles. Judges are at the intersection of humanity and policy: they are there to use their judgement, not merely parse the words and do the math. A judge probably wouldn’t have even done that part — their clerk would have. Is it cool and likely useful? Sure. Is it going to ‘outperform judges’ at their core competencies? Hell no.

SpaceManNabs today at 4:01 AM

As damning as these comments are, this one kinda scared me, because it reminds me of the times when judges decide against applying empathy toward society's most marginalized.

Hopefully as these models get better, we get to a place where judges are pressured to apply empathy more justly.