Hacker News

jumploops · yesterday at 6:58 PM

> “a new knowledge cutoff of August 2025”

This (and the price increase) points to a new pretrained model under the hood.

GPT-5.1, in contrast, allegedly used the same pretrained base as GPT-4o.


Replies

redox99 · today at 2:04 AM

I think it's more likely to be the old base model checkpoint, further trained on additional data.

FergusArgyll · yesterday at 8:37 PM

A new pretraining run would definitely get more than a .1 version bump, and a whole lot more hype, I'd think. They're expensive to do!

98Windows · yesterday at 8:24 PM

Or maybe 5.1 was an older checkpoint with heavier quantization.

MagicMoonlight · yesterday at 9:07 PM

No, they just feed in another round of slop to the same model.