Hacker News

ptx · yesterday at 12:40 PM · 2 replies

As I said, how are you going to check the source when LLMs can't provide sources? The models, as far as I know, don't store links to sources along with each piece of knowledge. At best they can plagiarize a list of references drawn from the same sources as the rest of the text, which will be somewhat accurate only by coincidence.


Replies

a1j9o94 · yesterday at 3:17 PM

Pretty much every major LLM client has web search built in. They aren't just using what's in their weights to generate the answers.

When it gives you a link, it literally takes you to the part of the page it got its answer from. That's how you can quickly validate it.
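(One way a client can implement that "jump to the part of the page" behavior is a URL text fragment, the `#:~:text=...` syntax most browsers support, so the citation opens the page scrolled to the quoted passage. A minimal sketch in Python, with a made-up URL and snippet purely for illustration:)

```python
from urllib.parse import quote

def text_fragment_link(page_url: str, quoted_snippet: str) -> str:
    """Build a Scroll-to-Text Fragment link (#:~:text=...) that opens
    the page scrolled to the quoted passage."""
    return f"{page_url}#:~:text={quote(quoted_snippet)}"

# Hypothetical URL and snippet, just to show the shape of the link.
print(text_fragment_link(
    "https://example.com/article",
    "the exact sentence the model quoted",
))
# https://example.com/article#:~:text=the%20exact%20sentence%20the%20model%20quoted
```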

sejje · yesterday at 3:00 PM

LLMs provide sources every time I ask them.

They do it by going out and searching, not by storing a list of sources in their corpus.
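(For a sense of what that search-plus-citation flow looks like from the API side, here's a rough sketch assuming OpenAI's Responses API and its hosted web search tool; exact tool names and response fields vary by provider and version, so treat the identifiers as assumptions rather than a definitive implementation:)

```python
# Sketch only: assumes OpenAI's Responses API with its hosted web search
# tool; tool names and response fields differ across providers/versions.
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4o",
    tools=[{"type": "web_search_preview"}],  # let the model search the live web
    input="What changed in the latest Python release?",
)

# Search-grounded answers come back with url_citation annotations that
# point at the pages the text was drawn from.
for item in response.output:
    if item.type == "message":
        for part in item.content:
            print(part.text)
            for ann in getattr(part, "annotations", []) or []:
                if ann.type == "url_citation":
                    print("  source:", ann.url)
```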
