Hacker News

falcor84 · today at 2:40 PM · 5 replies

Isn't the fact that a person is asking an AI whether to leave their partner in itself an indication that they should?

EDIT: typo


Replies

oldfrenchfries · today at 3:04 PM

The idea that asking implies a yes is actually a pretty common logical fallacy. In relationship science, we call this "relational ambivalence," and it's a completely normal part of any long-term commitment.

nomorewords · today at 2:47 PM

How is it an indication? I think people on here don't realize that most people don't think things through as much as (software) engineers do.

duskdozer · today at 2:58 PM

>asking an AI whether to leave your partner

Is that what they're asking, though? Because "relationship advice" is pretty vague.

dinkumthinkum · today at 4:06 PM

No, but it is an indication of brain-rot to ask a question seriously and also to think that the conclusion is foregone. It is a hallmark of our childlike current generation. Of course, the moment anything becomes difficult or unpleasant, one should quit, apparently. Surely, this kind of resiliency is what got humanity so far.
