Hacker News

schnitzelstoat today at 12:06 PM

It's quite rare that it gives a wrong answer nowadays. Even more so if you ask it to use the internet etc.

But yeah, it's not infallible. Sometimes, even when it gives you a source, it will incorrectly summarise it, but you can double-check the information in the source itself.

It just makes it a lot easier to do quickly rather than having to go and find the right Wikipedia article or dig through lots of documentation. Just like Wikipedia and online docs made it easier than having to go to the library or leaf through a 500-page manual etc.


Replies

Gigachad today at 12:24 PM

Only if you are asking surface-level questions. Certain topics also seem to be worse than others. When asked how to do things in software GUIs, modern LLMs seem to have a high rate of making up features or paths to reach them. For advice in games I've seen an extremely high rate of hallucinations. Asking why something is broken in my codebase has about a 95% hallucination rate.

If you are just asking basic science questions or about phone reviews, then it's pretty reliable.