I know it won't be in a holistic or deep philosophical way, but even by just predicting the next token without any real sense of the real world, LLMs are already capable of simulating basic reasoning, which a lot of people lack. I mean, even the 7B llama2 model can tell you the Earth ain't flat... go figure.