Hacker News

alansaber · yesterday at 4:43 PM

I think not, if only because the quantity of old data isn't enough to train anywhere near a SoTA model, at least until we change some fundamentals of LLM architecture.


Replies

andyfilms1 · yesterday at 4:47 PM

I mean, humans didn't need to read billions of books back then to think of quantum mechanics.

franktankbank · yesterday at 4:46 PM

Are you saying it wouldn't be able to converse in the English of the time?
