You are assuming that progress on factoring will be smooth, but this is unlikely to be true. The scaling challenges of quantum computers are very front-loaded. I know this sounds crazy, but there is a sense in which the step from 15 to 21 is larger than the step from 21 to 1522605027922533360535618378132637429718068114961380688657908494580122963258952897654000350692006139 (the RSA100 challenge number).
Consider the neutral atom proposal from TFA. They say they need tens of thousands of qubits to attack 256-bit keys. Existing machines have demonstrated six thousand atom qubits [1]. Since the size is ~halfway there, why haven't the existing machines broken 128-bit keys yet? Basically: because they still need to improve gate fidelity, do the system integration to combine pieces that have so far only been demonstrated separately, and solve some other problems. These dense block codes have minimum sizes and minimum qubit qualities that you must satisfy for the code to function at all. In that kind of situation, gradual improvement can take you surprisingly suddenly from "the dense code isn't working yet, so I can't factor 21" to "the dense code is working great now, so I can factor RSA100". Probably things won't play out quite like that... but if your job is to be prepared for quantum attacks, then you really need to worry about those kinds of scenarios.
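To make the "suddenly" part concrete: error-corrected codes have a threshold, and logical error rates for surface-code-style schemes scale roughly like p_L ~ A * (p/p_th)^((d+1)/2). Here's a toy sketch of that scaling (the numbers p_th=1% and A=0.1 are illustrative, not from TFA); a small improvement in physical error rate p around the threshold flips bigger codes from "actively worse" to "dramatically better":

```python
def logical_error_rate(p, d, p_th=0.01, a=0.1):
    """Rough surface-code-style scaling model: p_L ~ a * (p/p_th)^((d+1)/2).

    p    -- physical error rate per operation
    d    -- code distance (odd); bigger d = more physical qubits per logical qubit
    p_th -- threshold error rate (illustrative value, not from the article)
    """
    # Cap at 1.0: above threshold, bigger codes just fail outright.
    return min(1.0, a * (p / p_th) ** ((d + 1) // 2))

for p in [0.02, 0.011, 0.009, 0.005]:
    rates = ", ".join(f"d={d}: {logical_error_rate(p, d):.1e}" for d in (3, 11, 25))
    print(f"p={p}: {rates}")
```

At p=2% (above the toy threshold), growing the code from d=3 to d=25 makes the logical error rate worse; at p=0.5% it suppresses it by orders of magnitude. That's the "can't factor 21" to "can factor RSA100" cliff: the hardware metric improves smoothly, but the capability doesn't.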