Hacker News

f1shy today at 4:36 PM

I had an interesting conversation with a guy at work last week. We were discussing some unimportant matter. The guy has pretty high self-esteem, and even though he was arguing, in his own words, “out of belief and guess” while I knew for a fact what I was talking about, I had a hard time because he wouldn’t accept what I was saying. At some point he left and came back with “Gemini says I’m right! So, no more discussion.” I asked what exactly he had asked. He: “I have a colleague who is arguing X, I’m sure it’s Y. Who is right?!”

Of course Gemini said he was right! By a long shot. I asked Gemini the same thing, but as a very open-ended question, and it answered basically what I had been saying.

LLMs are pretty dangerous at confirming your own distorted view of the world.


Replies

bachmeier today at 5:17 PM

I agree with your conclusion, but that's by design. The goal is not to tell people the truth (how would it even do that?). The goal is to give the answer that would have come from the training data if that question were asked. And the reality is that confirmation is part of life. You may even struggle to stay married if you don't learn to confirm your wife's perspective.

lstodd today at 5:19 PM

It's more that insufficient emotional control is very dangerous. That's nothing new, but I guess LLMs have highlighted the problem a bit.