Hacker News

boscillator · yesterday at 7:39 PM

It will be fascinating to see the facts of this case, but if it is proven that their algorithms are discriminatory, even by accident, I hope Workday is held accountable. Making sure your AI doesn't violate obvious discrimination laws should be basic engineering practice, and the courts should help remind people of that.


Replies

zugi · yesterday at 7:46 PM

An AI class I took decades ago had just a one-day session on "AI ethics". Somehow, despite being short, it was memorable (or maybe because it was short...)

They said ethics demands that any AI that is going to pass judgment on humans must be able to explain its reasoning. An if-then rule, or even a statistical correlation between A and B, would be fine. Fundamental fairness requires that if an automated system denies you a loan, a house, or a job, it must be able to explain something you can challenge, fix, or at least understand.

LLMs may be able to provide that, but it would have to be carefully built into the system.
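To make that concrete, here is a minimal sketch of what a rule-based, reason-giving decision could look like. The criteria, thresholds, and field names are purely hypothetical illustrations, not anything a real screening system is known to use:

    # Hypothetical example: every adverse outcome carries reasons the
    # applicant could challenge, fix, or at least understand.
    from dataclasses import dataclass, field

    @dataclass
    class Decision:
        approved: bool
        reasons: list = field(default_factory=list)  # human-readable reason codes

    def screen_applicant(applicant: dict) -> Decision:
        """Apply transparent if-then rules and record why each one fired."""
        reasons = []
        if applicant.get("years_experience", 0) < 3:
            reasons.append("Fewer than 3 years of relevant experience (threshold: 3).")
        if not applicant.get("has_required_certification", False):
            reasons.append("Missing required certification.")
        approved = not reasons
        if approved:
            reasons.append("Met all listed criteria.")
        return Decision(approved=approved, reasons=reasons)

    result = screen_applicant({"years_experience": 2, "has_required_certification": True})
    print("Approved:", result.approved)
    for reason in result.reasons:
        print(" -", reason)

An LLM-based screener, by contrast, doesn't produce this kind of reason trail on its own; it would have to be added deliberately as part of the system's design.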

candiddevmike · yesterday at 7:47 PM

Would love to see some of the liability transfer to the companies using Workday too...