Hacker News

regentbowerbird · 01/21/2025 · 4 replies · view on HN

> compared this to researching on the internet, there are some good aspects, but more often than not, I end up reading an opinionated post by someone (no matter the topic, if you go deep enough, you will land on an opinionated factual telling).

ChatGPT is in fact opinionated: it has numerous political positions ("biases") and treats some subjects as taboo. The difference is that a single actor chooses the political opinions of a model that goes on to interact with far more people than a single opinion piece might.


Replies

lazybreather · 01/21/2025

Political searches, I assume, make up a very small percentage of real learning. Even in such cases, I would rather rely on a good LLM's response than scrounge through mainstream media websites, blogs, etc. For an objective answer, reading through opinionated articles and forming my own opinion is an absolute waste of time; I want the truth as accurately as possible. Plus, people don't generally change political opinions based on what they read. They read stuff that aligns with their side.

ankit219 · 01/21/2025

Yes, that is true. Though that can be sidestepped if you notice it and ask the model to ignore those biases (an extreme example would be opposition prep for a debate). I am not interested in politics and related issues anyway.

sanderjd · 01/21/2025

Fine. But it would never occur to me to try to form political opinions using ChatGPT.
