Hacker News

bensyverson · today at 4:18 PM

"Bullshit" is a human concept. LLMs do not work like the human brain, so to call their output "bullshit" is to ascribe malice and intent that simply aren't there. LLMs do not "think." But that doesn't mean they aren't incredibly powerful and helpful in the right context.


Replies

slopinthebag · today at 5:06 PM

I sort of agree. In this context, "bullshit" means "speech intended to persuade without regard for truth." While it's true that LLM output is produced without regard for truth, an LLM is not an entity with the agency to persuade, even though functionally that is what it can appear to do.

https://en.wikipedia.org/wiki/On_Bullshit