> The issue is that prediction is "part" of the human thought process, it's not the full story...
Do you have a proof for this?

Surely such a profound claim about the human thought process must rest on solid evidence somewhere? Otherwise, who's to say that all of human thought isn't just a derivative of "predicting the next thing"?
And the big players have built workflows that embed many elements beyond bare "prediction" into their AI products: from web search, to incorporating feedback from code testing, to feeding outputs back into future iterations. Who is to say that one or more of these additions hasn't pushed the ensemble across the threshold into "real actual thinking"?
The near-religious fervor with which people insist that "it's just prediction" makes me want to respond with some religious allusions of my own:
> Who is this that wrappeth up sentences in unskillful words? Gird up thy loins like a man: I will ask thee, and answer thou me. Where wast thou when I laid up the foundations of the earth? tell me if thou hast understanding. Who hath laid the measures thereof, if thou knowest? or who hath stretched the line upon it?
The point is that (as far as I know) we simply don't know the necessary or sufficient conditions for "thinking" in the first place, let alone "human thinking." Eventually we will most likely arrive at a scientific consensus, but right now the terms aren't nailed down well enough to support the kind of certainty I see from AI detractors.