Author here: yes that's all correct.
This is perhaps not clear enough, but the title refers to a pattern. For classical bits on a quantum computer this pattern is already playing out (as shown in the cited experiments), and for quantum bits I think it's about to play out.
Classical storage of classical bits is still far more reliable, of course. Hell, a rock chucked into one bucket or another is still more reliable. We'll never beat the classical computer at storing classical bits... but the rock in a bucket has some harsh competition coming.
I should maybe also mention that arbitrarily good qubits are a step on the road, not the end. I've seen a few Twitter takes making that incorrect extrapolation. We'll still need hundreds of these logical qubits. It's conceivable that the quantity also jumps suddenly... but that'd require even more complex block codes to start working (not just surface codes). I'm way less sure whether that will happen in the next five years.
I don’t really expect fancier codes to cause a huge jump in the number of logical qubits. At the end of the day, there’s some code rate (logical qubits / physical qubits) at which a quantum computer works. The “FOOM” is that code rate transitioning from zero (the lifetime of a logical qubit is short) to something distinctly different from zero (the state lasts long enough to be useful under some credible code). Say the code rate is 0.001 when this happens. (I haven’t been in the field for a little while, but I’d expect it to be higher, because those huuuuge low-rate codes have huuuuge syndromes, which isn’t so fun. But if true topological QC ever works, it will be a different story.) The code rate is unlikely to ever be higher than 1/7 or so, and it will definitely never exceed 1. So there’s at most a factor of 1000, and probably less, to be gained by improving the code rate. That isn’t an exponential or super-exponential FOOM.
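To make the arithmetic explicit, here’s a back-of-the-envelope sketch in Python. The 0.001 transition rate and the 1/7 ceiling are just the illustrative numbers above, not measured values:

    rate_at_transition = 0.001  # assumed code rate when QEC first "works"
    rate_practical_max = 1 / 7  # rough ceiling; 1/7 is the rate of the [[7,1,3]] Steane code
    rate_hard_max = 1.0         # a code rate can never exceed 1

    # Maximum multiplicative gain from better codes alone, with the
    # physical qubit count held fixed:
    print(rate_hard_max / rate_at_transition)       # 1000.0 (absolute cap)
    print(rate_practical_max / rate_at_transition)  # ~142.9 (more realistic)

Even the absolute cap is a one-time factor of 1000; improving the code rate is a single multiplicative gain, not something that compounds.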
A factor of 1000 may well be the difference between destroying Shor’s-algorithm-vulnerable cryptography now and destroying it later, though.