"Bits" is a bit misleading here; it would be more accurate to say "units of computation". If your problem space operates on 32-bit integers, this would be 32 bits * the number of steps; these papers work in terms of individual bits as the smallest unit of computation we can commonly reason about.
Wait, but if "bits of memory" here is supposed to mean "units of computation", that means that:
* The old assumption was that if you require t steps to complete, you need t/log(t) units of computation
* This new proof shows that if you require t steps to complete, you need sqrt(t) units of computation
Surely this doesn't make sense? Under any definition of "unit of computation" I would intuitively assume, computing t steps requires something proportional to t units of computation...
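
For a sense of scale, here's a rough sketch (constants and log bases glossed over) of how far apart the two bounds being discussed are for a few values of t:

```python
import math

# Rough comparison of the two bounds mentioned above, for a few sample t.
# Illustrative only -- constants and log bases are ignored.
for t in (10**4, 10**6, 10**9):
    old_bound = t / math.log2(t)   # ~ t / log(t)  (the older bound)
    new_bound = math.sqrt(t)       # ~ sqrt(t)     (the new bound, as paraphrased above)
    print(f"t = {t:>13,}   t/log(t) ~ {old_bound:>13,.0f}   sqrt(t) ~ {new_bound:>9,.0f}")
```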