Here's the gist:
For nearly 50 years theorists believed that if solving a problem takes t steps, it should also need roughly t bits of memory: 100 steps, roughly 100 bits. More precisely, the best known bound was t/log(t).
Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
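
To make the gap concrete, here's a rough back-of-the-envelope comparison (a sketch of my own: it treats the asymptotic bounds as if they were exact formulas and takes log base 2; Williams' bound is sqrt(t·log t) up to constants, which the sentence above rounds to sqrt(t)):

    import math

    # Illustration only: the real statements are asymptotic, so the hidden
    # constants matter and small t is not meaningful. Log base 2 is an
    # arbitrary choice; any base only shifts things by a constant factor.
    for t in (10**6, 10**9, 10**12):
        old = t / math.log2(t)             # Hopcroft-Paul-Valiant: ~ t/log t bits
        new = math.sqrt(t * math.log2(t))  # Williams: ~ sqrt(t log t) bits
        print(f"t = {t:>16,}   t/log t ~ {old:>18,.0f}   sqrt(t log t) ~ {new:>12,.0f}")

The gap only becomes dramatic for large t; the "100 steps, ~10 bits" figure above uses the simpler sqrt(t) form, and at that scale the hidden constants dominate anyway.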
Why would it need as many bits of memory as steps?
> Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
Does it require one to first have solved the problem in uncompressed space?
> Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory
At least, at most, or on average?
Huh? What did any of those theorists think about the phrase "time/space tradeoff"?
It doesn’t make practical sense to assign a number to problems which could have unknown dependencies, unless you are talking about bounded math problems.
> log(t)
log to what base? 2 or e or 10 or...
Why do programmers have to be so sloppy?
To be clear, this was/is a statement about the worst case, not every problem. So it may be clearer to rephrase your sentence as:
For nearly 50 years theorists believed that there exist problems taking t steps that also need roughly t/log(t) bits of memory.
(The Quanta magazine article https://www.quantamagazine.org/for-algorithms-a-little-memor... gives some context: for a long time, the best result was t/log(t) as proved in "On Time Versus Space" https://dl.acm.org/doi/10.1145/322003.322015 by Hopcroft, Paul and Valiant in 1975.)
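
For reference, here is my paraphrase of the two simulation results in standard notation (assuming the multitape Turing machine model, with constant factors suppressed):

    % Paraphrase, not the authors' exact statements:
    % Hopcroft-Paul-Valiant: time t can be simulated in space O(t / log t).
    % Williams (2025): time t can be simulated in space O(sqrt(t log t)).
    \[
      \mathsf{TIME}(t) \subseteq \mathsf{SPACE}\!\Big(O\big(\tfrac{t}{\log t}\big)\Big),
      \qquad
      \mathsf{TIME}(t) \subseteq \mathsf{SPACE}\!\Big(O\big(\sqrt{t \log t}\,\big)\Big).
    \]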