
voidspark today at 4:35 AM

> inability to self-reflect and recognize they have to ask for more details because their priors are too low.

Gemini 2.5 Pro and ChatGPT-o3 have often asked me to provide additional details before doing a requested task. Gemini sometimes comes up with multiple options and requests my input before doing the task.


Replies

Workaccount2 today at 2:26 PM

Gemini is also the first model I have seen call me out in its thinking. Stuff like "The user suggested we take approach ABC, but I don't think the user fully understands ABC. I will suggest XYZ as an alternative since it would be a better fit."

rrr_oh_man today at 4:46 AM

That's a recent development, aimed (imho) at higher engagement and reduced compute.
