Hacker News

jgeada · last Friday at 8:42 PM · 3 replies

There is a reason QC factorization records haven't shifted much over the past few years. The number of qubits by itself isn't enough. You need to be able to do computation on them, and for long enough to run Shor's algorithm until it produces a solution. How the qubits are connected, how reliable the logic gates are, and how long you can maintain quantum coherence with enough fidelity to get results are equally important.

That no significant factorization milestones have moved is a huge black eye for this field. Even worse, the fact that no one has ever truly been able to run Shor's algorithm on even trivial numbers is a shocking indictment of the whole field.
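For context, "trivial numbers" here means cases like N = 15: once the quantum hardware has found the order r of some a modulo N, the rest of Shor's algorithm is plain classical number theory. Below is a minimal Python sketch of that classical half; the find_order function is my brute-force stand-in for the quantum order-finding subroutine (which is exactly the part that doesn't scale classically), not anything a real QC does.

    # Classical half of Shor's algorithm; the quantum computer's only job is
    # to supply the order r of a modulo n. Here find_order() brute-forces it.
    from math import gcd

    def find_order(a: int, n: int) -> int:
        """Smallest r > 0 with a**r % n == 1 (stand-in for the quantum step)."""
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_classical_part(n: int, a: int):
        g = gcd(a, n)
        if g != 1:
            return g, n // g               # lucky guess: no quantum step needed
        r = find_order(a, n)               # <-- the quantum subroutine goes here
        if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
            return None                    # unlucky choice of a, retry with another
        f = gcd(pow(a, r // 2) - 1, n)
        return f, n // f

    print(shor_classical_part(15, 7))      # (3, 5) -- the famous "factoring 15" demo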


Replies

tomgag · last Friday at 8:50 PM

The reasons you listed are exactly why the lack of factorization records should not be seen as a "critical black eye to this field": factorization records simply aren't a relevant measure of progress at this stage. Again, think of the parallel with LLMs: it took decades to get out of the "AI winter", because that's what non-linear technological progress looks like.

With QC, the risk (and I am not saying this is going to happen, but it is a risk that can't be overlooked) is that we transition from "QC can only factorize 15" to "RSA-2048 is broken" so suddenly that the industry has no time to adapt.

mlyle · last Friday at 9:54 PM

I think the main thing is: quantum computing doesn't really work right now, at all.

Imagine if you had crummy, unreliable transistors. You couldn't build any computing machine out of them.

Indeed, in the real world progress looked like:

* Useless devices (1947)

* Very limited devices (hearing aids)

* Hand-selected, lab devices with a few hundred transistors, computing things as stunts (1955)

* The IBM 1401, a practical transistorized computer (1959), once devices got reliable enough and ancillary technologies like packaging improved

And then just a couple of years later, devices were reliable enough to move to integrated circuits for logic.

If you looked at a "transistorized factorization record", it would have been static for several years before jumping by several orders of magnitude in a couple of steps.

In other words, there was a pattern of many years of seemingly negligible progress and then a sudden step once the foundational component reached a critical point. I think that's the point the person you're talking to is making.

bawolff · yesterday at 12:09 AM

> That no significant factorization milestones have moved is a huge critical black eye to this field.

But everyone knew going in that it wasn't going to move. It would have been shocking if it had. It was never a reasonable medium-term goal.