Hacker News

tomgag · yesterday at 7:57 PM

I guess I'll post it here as well. This is my personal take on the whole story: https://gagliardoni.net/#20250714_ludd_grandpas

A relevant quote: "this is your daily reminder that 'How large is the biggest number it can factorize' is NOT a good measure of progress in quantum computing. If you're still stuck in this mindset, you'll be up for a rude awakening."

Related: this is from Dan Bernstein: https://blog.cr.yp.to/20250118-flight.html#moon

A relevant quote: "Humans faced with disaster tend to optimistically imagine ways that the disaster will be avoided. Given the reality of more and more user data being encrypted with RSA and ECC, the world will be a better place if every effort to build a quantum computer runs into some insurmountable physical obstacle"


Replies

kevinventullo · yesterday at 8:58 PM

A better measure of progress (valid for cryptanalysis, which is anyway a very minor aspect of why QCs are interesting, IMHO) would be: how far are we from fully error-corrected and interconnected qubits? I don't know the answer, or at least I don't want to give estimates here. But I do know that over the last 10+ years, all the objective indicators of progress toward that cliff have been steadily improving: qubit fidelity, error rate, coherence time, interconnection... At this point I don't think it's wise to keep trashing the field of quantum security as "academic paper churning".

I think the problem is that “objective indicators pointing to the cliff” is pretty handwavy. Could there be a widely agreed-upon function of qubit fidelity, error rate, coherence time, and interconnections that measures, even coarsely, how far we are from the cliff? It seems like the cliff has been ten years away for a very long time, so you might forgive an outsider for believing there has been a lot of motion without progress.

ethan_smith · today at 12:35 AM

Shor's algorithm still requires O(log N) logical qubits and O((log N)² · log(log N) · log(log(log N))) operations to factor N, which is why these satirical "records" highlight the absurdity of focusing solely on factorization milestones rather than addressing the scaling challenges of building that many error-corrected qubits.
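A back-of-the-envelope version of that scaling, as a sketch: with n = log₂(N) bits, a textbook Shor circuit (e.g. the well-known 2n+3-qubit construction) needs O(n) logical qubits and roughly n² · log n · log log n gates when fast multiplication is used. The constants below are illustrative assumptions, not a hardware estimate.

```python
import math

def shor_resources(n_bits):
    """Rough logical-qubit and gate counts for factoring an n-bit number.
    2n+3 is a common textbook qubit figure; the gate count follows the
    n^2 * log(n) * log(log(n)) scaling, with all constants set to 1."""
    logical_qubits = 2 * n_bits + 3
    ops = n_bits**2 * math.log2(n_bits) * math.log2(math.log2(n_bits))
    return logical_qubits, ops

for n in (8, 2048):
    q, ops = shor_resources(n)
    print(f"n={n}: ~{q} logical qubits, ~{ops:.2e} ops")
```

The point the satire makes drops out of the numbers: the qubit count grows only linearly in the bit length, so "biggest number factorized" jumps from trivial to catastrophic as soon as a few thousand error-corrected qubits exist.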

adgjlsfhk1 · today at 3:17 AM

The thing that still feels off to me is that you should be able to run 8-bit Shor's algorithm without error correction, right? Sure, we don't have reliable error-corrected qubits, but factoring a number that small should be possible (even with a fairly high error rate) on current hardware. Sure, it won't be 100% reliable, but if we had published results showing that in 2010 it got the right answer 10% of the time, and in 2025 it gets the right answer 25% of the time, that would at least be a measure of progress.
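For scale, here is a classical sketch of what such an 8-bit demo would have to reproduce. The quantum part of Shor's algorithm only finds the period r of a^x mod N; everything else is classical post-processing. Below, the period is brute-forced classically (a stand-in for the quantum subroutine), so this is an illustration of the algorithm's structure, not a quantum speedup.

```python
import math
import random

def period(a, N):
    """Brute-force the multiplicative order of a mod N.
    This is the step a quantum computer would do via phase estimation."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, tries=50):
    """Shor's classical post-processing around a (here classical) period finder."""
    for _ in range(tries):
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g                      # lucky draw: a shares a factor with N
        r = period(a, N)
        if r % 2 == 0:
            f = math.gcd(pow(a, r // 2, N) - 1, N)
            if 1 < f < N:
                return f                  # nontrivial factor found
    return None

print(shor_classical(221))  # 221 = 13 * 17, an 8-bit number
```

A noisy quantum device running the same structure would simply succeed on fewer of the random tries, which is exactly why a published success rate over time would be a legible progress metric.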

jgeada · yesterday at 8:06 PM

Except that factorization is exactly what is needed to break RSA encryption, so knowing what QCs can do in that realm of mathematics and computing is exactly the critical question that needs to be asked.

And a reminder that in the world of non-QC computing, right from its very roots, the capability of computers improved in mind-bogglingly large steps every year.

QC records, other than the odd statistic about how many qubits they can build, have largely not made any strides toward solving real-world-sized problems (with the exception of those that use QCs purely as analog computers to model quantum behavior).

theuirvhhjj588 · today at 12:14 AM

That's a cop-out.

I agree with what you're saying, but you're also essentially saying that QCs are so useless at the moment that even the granularity of integers is too coarse to measure progress on the hardware.