Google's SHA-1 collision took 2^63.1 hash operations to find. Given that a single hash operation takes more than 1,000 cycles (~2^10), the total work comes to roughly 2^73 cycles, only about seven doublings short of 2^80.
Cryptographers worry about big numbers. 2^80 is not considered secure.
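A quick sketch of that arithmetic, assuming (hypothetically) exactly 1,000 cycles per SHA-1 computation:

```python
import math

# Rough cost of the SHAttered attack in cycles, assuming
# a flat 1000 cycles per SHA-1 computation (an assumption,
# the comment above only says "more than 1000").
hash_ops = 2 ** 63.1
cycles_per_hash = 1000

total_cycles_log2 = math.log2(hash_ops * cycles_per_hash)
print(f"total work ~ 2^{total_cycles_log2:.1f} cycles")      # ~ 2^73.1

# How many doublings of compute until the 2^80 mark:
print(f"doublings to 2^80: {80 - total_cycles_log2:.1f}")    # ~ 6.9
```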
It's early so I'm thinking out loud here but I don't think the algorithm scales like this, does it?
We're talking about something that can search a list of size N in √N iterations. Splitting the problem in two doesn't halve the compute required for each half: searching 100 items on one machine takes ~10 iterations, but split over two machines it takes ~7 iterations on each, or ~14 in total.
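The √N arithmetic above can be sketched like this (using plain √N rather than Grover's exact (π/4)·√N iteration count; the constant factor cancels out of the comparison anyway):

```python
import math

def search_iterations(n):
    # A Grover-style quantum search needs on the order of sqrt(N)
    # iterations; the exact count is (pi/4) * sqrt(N), but the
    # constant doesn't change the one-machine-vs-two comparison.
    return math.sqrt(n)

one_machine = search_iterations(100)        # ~10 iterations for all 100 items
per_machine = search_iterations(50)         # ~7.1 iterations for each half
total_split = 2 * per_machine               # ~14.1 iterations of total work

print(round(one_machine, 1), round(per_machine, 1), round(total_split, 1))
# prints: 10.0 7.1 14.1
```

So splitting across k machines divides the wall-clock time only by √k while multiplying the total work by √k, which is why the attack doesn't parallelize the way a classical brute-force search does.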