This seems to be a feature most chatbots have copied from each other. I've found that OpenAI's implementation of suggestions rarely results in something useful.
"Do you want me to find actual eBay links for an X?"
"Yes"
"Okay, on eBay you can find links by searching for..."
It does work when I'm guiding it, but the suggested next action is only occasionally useful. The funniest version of this was when I uploaded a PDF of Kessler 1995 on PTSD just to talk through some other search terms, and Gemini suggested the following ridiculous confluence of memory (clearly from other chats) and suggestion:
> Since you mentioned being interested in the ZFS file system and software consulting, would you be interested in seeing how the researchers used Kaplan-Meier survival analysis to map out the "decay" of PTSD symptoms over time?
Top notch suggestion, mate. Really appreciate the explanation there as well.
It's interesting how seldom the manipulativeness of these agents comes up. Hopefully that discussion grows.