I think the key difference here is that if you type the wrong thing into Google, it returns poor results that make it fairly clear you're not on the right track.
LLMs, by contrast, will sometimes just invent something plausible-sounding that basically gaslights you into thinking you're on the right track.