Hacker News

miroljub · yesterday at 12:56 PM

> Gemini does the same thing. For every question it looks to extend the conversation into natural follow-up questions, always ending a response with "Would you like to know more about {some important aspect of the answer}?"

If the aspect of the answer is important, wouldn't it be better just not to skip it?

> And...I don't see it as a bad thing. It's trying to encourage use of the tool by reducing the friction to continued conversations, making it an ordinary part of your life by proving that it provides value.

To me, it just adds friction. Why do I have to beg, asking multiple times, to get an answer it already knows I'm looking for but still decides to withhold? It's neither natural nor helpful. It's manipulative.

> It's similar to Netflix telling you other shows you might like because they want to continue providing value to justify the subscription.

It's not the same, because Netflix doesn't hide important movie sequences from you behind a question like "If you'd like, I can show you this important scene that I just fast-forwarded past."


Replies

llm_nerd · yesterday at 1:20 PM

Groan. This is performative outrage, and it's just boorish. The other person noted that ChatGPT uses bait-type continuations (Gemini and Claude do not), and sure, that is a problem, but your reply is just noise. Beg? Christ.

There is utterly nothing wrong with AI engines offering continuation questions. But there's always something for people to whine about.

Humans do not want to ask a question and get a book in response. They just don't. No one, including you, wants such a response. And if you did get one, I absolutely guarantee, given this performative outrage, that you'd be the first to complain about it.
