When someone says "AIs aren't really thinking, because AIs don't think like people do," what I hear is "airplanes aren't really flying, because airplanes don't fly like birds do."
If I shake some dice in a cup, are they thinking about what number they'll reveal when I throw them?
This really shows how imprecise the term 'thinking' is here. In this sense, any predictive probabilistic black-box model could be termed 'thinking', particularly when juxtaposed against something as concrete as flight, which we have modelled extremely accurately.
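A minimal sketch of that point, using only the Python standard library: under such a loose definition, a cup of dice and a next-token language model present the same interface, a probability distribution you sample an outcome from. The `next_token_model` below and its probabilities are invented for illustration; it is not a real LM.

```python
import random

def dice_model():
    """A 'predictive probabilistic black box': a uniform
    distribution over the faces of a fair six-sided die."""
    return {face: 1 / 6 for face in range(1, 7)}

def next_token_model(context):
    """Toy stand-in for a language model: a distribution over
    next tokens. The context is ignored here; a real LM would
    condition on it."""
    return {"the": 0.5, "a": 0.3, "dice": 0.2}

def sample(distribution):
    """Draw one outcome from a probability distribution."""
    outcomes, weights = zip(*distribution.items())
    return random.choices(outcomes, weights=weights, k=1)[0]

# Both 'models' reduce to the same operation: sample a distribution.
print(sample(dice_model()))                   # e.g. 4
print(sample(next_token_model("roll the")))   # e.g. 'the'
```

Whether either one is 'thinking' is exactly what the loose definition fails to settle.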