Hacker News

voidspark · 05/15/2025 · 2 replies · view on HN

> inability to self-reflect and recognize they have to ask for more details because their priors are too low.

Gemini 2.5 Pro and ChatGPT-o3 have often asked me to provide additional details before doing a requested task. Gemini sometimes comes up with multiple options and requests my input before doing the task.


Replies

Workaccount2 · 05/15/2025

Gemini is also the first model I have seen call me out in its thinking. Stuff like: "The user suggested we take approach ABC, but I don't think the user fully understands ABC. I will suggest XYZ as an alternative since it would be a better fit."

rrr_oh_man · 05/15/2025

That's a recent development, aimed (imho) at higher engagement and reduced compute.
