Hacker News

tony_cannistra · yesterday at 9:11 PM · 4 replies

Completely infuriating, but more of a commentary on the sad state of incompetent power-hungry law enforcement with tools they don't know how to use than the tools themselves.

Though, the question remains: are the tools built in such a way as to deceive the user into a false sense of trust or certainty?

_Some_ of the blame lies on the UX here. It must.


Replies

sidrag22 · yesterday at 9:24 PM

It must land as a human's fault, or this will become more and more of a pattern used to avoid accountability.

ImPostingOnHN · yesterday at 9:27 PM

> are the tools built in such a way as to deceive the user into a false sense of trust or certainty? _Some_ of the blame lies on the UX here. It must.

Are AI code assist tools built in such a way as to deceive the user into a false sense of trust or certainty? Very much so (even if that isn't a primary objective).

Does any part of the blame lie on the UX if a dev submits a bad change? No, none.

You are ultimately, solely responsible for your work output, regardless of which tool you choose to use. If using your tool wrong means you make someone homeless and car-less, and also kill their dog, then you should be a lot more cautious and perform a lot more verification than the average senior engineer.

throw_m239339 · yesterday at 9:29 PM

> they don't know how to use than the tools themselves.

No, the tools work perfectly as they were designed to work. The problem is that the tools are flawed.

Ultimately, every single one of these decisions should be approved by a human, who should be held responsible for the screw-up no matter what the consequences are.

> _Some_ of the blame lies on the UX here. It must.

No, the blame lies with the person or the group who approved the use of these tools without understanding their shortcomings.

hsbauauvhabzb · yesterday at 9:27 PM

Spoken like someone who isn’t built for a sales role at said company.

Sales will sell the dream; who cares if the real-world outcomes don't align?