Hacker News

empath75 · yesterday at 4:09 PM

> I have to disagree here. When you are tasked with dividing 2 big numbers you most certainly don't "autocomplete" (in the sense of finding the most probable next tokens, which is what an LLM does); rather, you go through a set of steps you have learned.

Why do you think that this is the part that requires intelligence, rather than a more intuitive process? We have had machines that can do this mechanically for well over a hundred years.
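To make the point concrete (my illustration, not from the thread): the "set of steps you have learned" for dividing two big numbers is just schoolbook long division, and a machine can follow those steps mechanically, one digit at a time, with no intelligence involved.

```python
def long_division(dividend: int, divisor: int) -> tuple[int, int]:
    """Return (quotient, remainder) the schoolbook way: digit by digit."""
    quotient = 0
    remainder = 0
    for digit in str(dividend):                  # walk the dividend left to right
        remainder = remainder * 10 + int(digit)  # "bring down" the next digit
        q_digit = remainder // divisor           # how many times does the divisor fit?
        quotient = quotient * 10 + q_digit       # append that digit to the quotient
        remainder -= q_digit * divisor           # keep what's left over
    return quotient, remainder

print(long_division(987654321, 1234))  # same result as divmod(987654321, 1234)
```

Every step is a fixed rule applied blindly, which is exactly why purely mechanical rule-following is a weak criterion for what counts as thinking.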

There is a whole category of critiques of AI of this type: "Humans don't think this way; they mechanically follow an algorithm/logic." But computers have been able to mechanically follow algorithms and perform logic from the beginning, and that isn't thinking!


Replies

cpt_sobel · today at 7:47 AM

Good points: just mechanically following algorithms isn't thinking, and neither is "predicting the next tokens".

But would a combination of the two be close to what we define as thinking?