logoalt Hacker News

nemomarxyesterday at 7:51 PM3 repliesview on HN

I'm sure you could get an LLM to create a plausible sounding justification for every decision? It might not be related to the real reason, but coming up with text isn't the hard part there surely


Replies

zugiyesterday at 9:18 PM

> I'm sure you could get an LLM to create a plausible sounding justification for every decision.

That's a great point: funny, sad, and true.

My AI class predated LLMs. The implicit assumption was that the explanation had to be correct and verifiable, which may not be achievable with LLMs.

show 1 reply
nullcyesterday at 9:17 PM

Yes, they will, they'll rationalize whatever. This is most obvious w/ transcript editing where you make the LLM 'say' things it wouldn't say and then ask it why.

SpaceNoodledyesterday at 8:40 PM

It sounds like you're saying we should generate more bullshit to justify bullshit.

show 1 reply