So we should believe the hallucinations because they sound like something that could be true? Does the LLM in the middle somehow make it more trustworthy than if GP had just shared their own pattern-matching conjecture?
No. I think LLMs are garbage. Separately, and unrelatedly: I think Facebook is behind these bills. The LLM may be garbage and still sometimes produce a correct result.