Hacker News

tsoukas · yesterday at 9:36 PM · 2 replies

If an LLM hallucinates on 1% of occasions and gives subpar output on 5%, that kills its effectiveness at replacing anyone. Imagine a support agent on the other end of the phone speaking gibberish 10 times a day. Now imagine a doctor. These people will never lose their jobs.
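
A rough back-of-the-envelope reading of those numbers, as a sketch: the 1% and 5% rates are from the comment, but the daily call volume is an assumption, not something the commenter stated.

    # Expected bad interactions per day for a support agent,
    # assuming a hypothetical volume of ~170 calls/day (not stated in the comment).
    calls_per_day = 170          # assumed daily call volume
    hallucination_rate = 0.01    # 1% of occasions, per the comment
    subpar_rate = 0.05           # 5% subpar output, per the comment

    expected_failures = calls_per_day * (hallucination_rate + subpar_rate)
    print(f"Expected bad interactions per day: {expected_failures:.0f}")  # ~10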


Replies

Bratmon · yesterday at 9:37 PM

> Imagine a support agent on the other end of the phone speaking gibberish 10 times a day.

A massive improvement?

simianwords · yesterday at 9:40 PM

But LLMs don't speak gibberish 10 times a day even now. From my usage, ChatGPT hasn't said one obviously strange thing since o3 came out.
