> “a new knowledge cutoff of August 2025”
This (and the price increase) points to a new pretrained model under the hood.
GPT-5.1, in contrast, was allegedly using the same pretraining as GPT-4o.
A new pretrain would surely get more than a .1 version bump, and a whole lot more hype, I'd think. They're expensive to do!
Or maybe 5.1 was an older checkpoint with heavier quantization.
No, they just feed in another round of slop to the same model.
I think it's more likely to be the old base model checkpoint further trained on additional data.