Hacker News

akkad33 · yesterday at 7:40 PM · 3 replies

Couldn't this backfire if they put LLMs on safety-critical data? Or even if someone asks LLMs for medical advice and dies?


Replies

bigstrat2003 · today at 1:47 AM

You already shouldn't be using LLMs for either of those things. Doing so is tremendously foolish, given how stupid and unreliable the models are.

nxpnsv · yesterday at 7:42 PM

I guess the point is that doing so is already not safe?

awkward · yesterday at 7:43 PM

There are several humans who have to make decisions somewhere between the bad training data and a life-or-death decision coming out of an LLM.