Hacker News

a-t-c-g · today at 3:58 AM

Yes - some degree of reasoning appears to be latent in the structure of language itself. But models trained explicitly on reasoning-focused data still perform better than models trained only on general corpora.*

*At least up to 300B parameters, based on the models we’ve tested.