Hacker News

Willow, Our Quantum Chip

1355 points by robflaherty | 12/09/2024 | 513 comments

Comments

vessenes 12/09/2024

I’m a quantum dabbler so I’ll throw out an armchair reaction: this is a significant announcement.

My memory is that 256-bit keys in non-quantum-resistant algos need something like 2,500 qubits or so; and by that I mean generally useful, programmable qubits. Showing a bit over 100 qubits with stability, meaning the information survives a while, long enough to be read, and general enough to run some benchmarks on, is something many people thought might never come.
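
For rough scale (illustrative figures, not from the announcement): published resource estimates put breaking a 256-bit elliptic-curve key at around 2,330 logical qubits, and the classic Beauregard circuit for Shor's algorithm factors an n-bit RSA modulus with about 2n + 3 logical qubits. A back-of-the-envelope sketch:

```python
# Rough logical-qubit estimates for Shor's algorithm.
# Beauregard's construction factors an n-bit RSA modulus
# using 2n + 3 logical qubits.

def shor_logical_qubits(n_bits):
    """Logical qubits for factoring an n-bit modulus (Beauregard circuit)."""
    return 2 * n_bits + 3

for n in (1024, 2048):
    print(f"RSA-{n}: ~{shor_logical_qubits(n)} logical qubits")
```

Note these are logical (error-corrected) qubits; the physical-qubit counts are orders of magnitude higher, which is why the error-correction result matters more than the raw qubit number.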

There’s a sort of religious reaction people have to quantum computing: it breaks so many things that I think a lot of people just like to assume it won’t happen: too much in computing and data security will change -> let’s not worry about it.

Combined with the slow pace of physical research progress (Shor's algorithm for quantum factoring dates to the mid-90s) and snake-oil sales companies, it's easy to ignore.

Anyway seems like the clock might be ticking; AI and data security will be unalterably different if so. Worth spending a little time doing some long tail strategizing I’d say.

codeulike 12/09/2024

They opened the API for it and I'm sending requests but the response always comes back 300ms before I send the request, is there a way of handling that with try{} predestined{} blocks? Or do I need to use the Bootstrap Paradox library?

jawns 12/09/2024

> It lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse

I see the evidence, and I see the conclusion, but there's a lot of ellipses between the evidence and the conclusion.

Do quantum computing folks really think that we are borrowing capacity from other universes for these calculations?

GilKalai 12/10/2024

In the past five years I participated in a project (with Yosi Rinott and Tomer Shoham) to carefully examine Google's 2019 "supremacy" claim. A short introduction to our work is described here: https://gilkalai.wordpress.com/2024/12/09/the-case-against-g.... We found statistically unreasonable predictions in that experiment (predictions that were "too good to be true"), indicating methodological flaws. We also found evidence of undocumented global optimization in the calibration process.

In view of these and other findings my conclusion is that Google Quantum AI’s claims (including published ones) should be approached with caution, particularly those of an extraordinary nature. These claims may stem from significant methodological errors and, as such, may reflect the researchers’ expectations more than objective scientific reality.

djoldman 12/09/2024

I wonder if anyone else will be forced to wait on https://scottaaronson.blog/ to tell us if this is significant.

fidotron 12/09/2024

The slightly mind-blowing bit is detailed here: https://research.google/blog/making-quantum-error-correction...

“the first quantum processor where error-corrected qubits get exponentially better as they get bigger”

Achieving this turns the normal problem of scaling quantum computation upside down.
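
The "upside down" scaling can be sketched numerically: below the error-correction threshold, each increase of the surface-code distance d by 2 divides the logical error rate by roughly a factor Λ. The Λ = 2 and starting error rate here are illustrative placeholders, not Google's reported figures:

```python
# Illustrative "below threshold" scaling: once physical error rates are
# under the surface-code threshold, the logical error rate shrinks
# exponentially with code distance d, roughly
#   eps_d ≈ eps_3 / Lambda**((d - 3) / 2)
# Lambda and eps_3 below are made-up illustrative values.

def logical_error_rate(d, eps3=3e-3, lam=2.0):
    """Logical error per cycle at odd code distance d >= 3."""
    return eps3 / lam ** ((d - 3) / 2)

for d in (3, 5, 7, 9):
    print(f"distance {d}: logical error rate ~{logical_error_rate(d):.2e}")
```

The point: above threshold, adding qubits adds more noise than the code can remove; below it, adding qubits makes the logical qubit exponentially better.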

readyplayernull 12/09/2024

> It lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch.

Processing in the multiverse. Would that mean we are injecting entropy into those other verses? Could we calculate how many there are from the time it takes to do a given calculation? We need to cool the quantum chip in our universe; how are the (n-1)verses cooling on their end?

codeyperson 12/10/2024

If we are in a simulation, this seems like a good path to getting our process terminated for consuming too much compute.

machina_ex_deus 12/10/2024

You learn a lot from what isn't mentioned. Willow had 101 qubits in the quantum error correction experiment, yet a mere 67 qubits in the random circuit sampling experiment. Why didn't they test random circuit sampling with the full set of qubits? Maybe when turning on the full set of 101 qubits, qubit fidelity dropped.

Remember that macroscopic objects have ~10^23 ≈ 2^76 particles, so until 76 qubits are reached and exceeded, I remain skeptical that the quantum system actually exploits an exponential Hilbert space, rather than the state being classically encoded by the particles somehow. I bet Google is struggling just at this threshold and isn't announcing it.

vhiremath4 12/09/2024

I really wish the release videos made things a tad less technical. I know quantum computers are still very early, so the target audience for this kind of release is technical, but I can't help wondering how many more people would be excited and pulled in if they made the main release video more approachable.

DebtDeflation 12/09/2024

> the more qubits we use in Willow, the more we reduce errors, and the more quantum the system becomes

That's an EXTRAORDINARY claim and one that contradicts the experience of pretty much all other research and development in quantum error correction over the course of the history of quantum computing.

correlator 12/09/2024

I met Julian while touring UCSB as prospective grad students. We sat together at dinner and he was really smart, kind, and outgoing. Great to see him presenting this work!

crishoj 12/10/2024

Take the announcement with a grain of salt. From German physicist Sabine Hossenfelder:

> The particular calculation in question is to produce a random distribution. The result of this calculation has no practical use.
>
> They use this particular problem because it has been formally proven (with some technical caveats) that the calculation is difficult to do on a conventional computer (because it uses a lot of entanglement). That also allows them to say things like "this would have taken a septillion years on a conventional computer" etc.
>
> It's exactly the same calculation that they did in 2019 on a ca 50 qubit chip. In case you didn't follow that, Google's 2019 quantum supremacy claim was questioned by IBM pretty much as soon as the claim was made and a few years later a group said they did it on a conventional computer in a similar time.

https://x.com/skdh/status/1866352680899104960

rdtsc 12/09/2024

The main part for me is reducing errors faster as they scale. This was a major roadblock; operating "below threshold" is a major achievement.

I am not sure about RCS as the benchmark, as I'm not sure how useful it is in practice. It just produces really nice numbers. If I had a few billion in pocket change lying around, would I buy this to run RCS really fast? Nah, probably not. I'll get more excited when they factor numbers at a rate that would break public-key crypto. For that I would spend my pocket change!

TachyonicBytes 12/09/2024

Link to the actual article: https://www.nature.com/articles/s41586-024-08449-y

hi41 12/09/2024

What benchmark is being referred to here?

> Willow performed a standard benchmark computation in under five minutes that would take one of today’s fastest supercomputers 10 septillion (that is, 10^25) years — a number that vastly exceeds the age of the Universe

xnx 12/09/2024

Is anyone else even close to Google in this space? (e.g. on the "System Metrics" the blog defines)

softwaredoug 12/09/2024

With this plus the weather model announcement, I'm curious what people think about the meta-question of why corporate labs like Google DeepMind seem to make more progress on big problems than academia.

There are a lot of critiques of academia, in particular that it's so grant-obsessed you have to stay focused on your next grant all the time. This environment doesn't seem to reward solving big problems, but rather paper production to prove the last grant did something. Yet ostensibly we fund fundamental public research precisely for fundamental changes. The reality seems to be that the traditional funding model creates incremental progress within existing paradigms.

johndhi 12/10/2024

Genuinely curious: does this make US regulators second-guess breaking up Google? Having a USA company be the first to develop quantum computing would be a major national security advantage.

fguerraz 12/09/2024

Am I oversimplifying in thinking that they've demonstrated that their quantum computer is better at simulating a quantum system than a classical computer is?

In which case, should I be impressed? I mean sure, it sounds like you’ve implemented a quantum VM.

dtquad 12/09/2024

Do Americans still want to break up the big US tech companies like Google? With proper regulation, it feels like their positive externalities, like this, are good for humanity.

whiplash451 12/10/2024

Newbie's question: how far is the RCS benchmark from a more practical challenge such as breaking RSA?

The article concludes by saying that the former does not have practical applications. Why are they not using benchmarks that have some?

Sakurai_Quantum 12/09/2024

Great time to learn the mathematical foundations of quantum computing rigorously: https://quantumformalism.academy/mathematical-foundations-fo.... All self-paced with problem sheets (proof challenges) and worked out solutions, etc.

yalogin 12/10/2024

I'm trying to understand how computation happens in quantum computers. Is there a basic explanation of how superposition leads to computing?

From ChatGPT: "with n qubits, a QC can be in a superposition of 2^n different states. This means that QCs can potentially perform computations on an exponential number of inputs at once."

I don't get how the first sentence in that quote leads to the second one. Any pointers to read to understand this?
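
One way to see the gap between those two sentences is a tiny statevector demo (a sketch using numpy; `uniform_superposition` is my own illustrative helper, nothing from the article):

```python
import numpy as np

# An n-qubit register is a vector of 2**n complex amplitudes. A Hadamard
# gate on every qubit puts the register in an equal superposition of all
# 2**n basis states. The catch: measuring returns just ONE basis state,
# so quantum algorithms must arrange interference so the right answers
# end up with high probability; it is not literally 2**n parallel runs.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard

def uniform_superposition(n):
    """Apply H to each of n qubits starting from |00...0>."""
    Hn = H
    for _ in range(n - 1):
        Hn = np.kron(Hn, H)       # build H⊗H⊗...⊗H
    state = np.zeros(2 ** n)
    state[0] = 1.0                # start in |00...0>
    return Hn @ state

amps = uniform_superposition(3)
print(len(amps))   # 8 amplitudes for 3 qubits, each 1/sqrt(8)
```

So the first sentence is literally true (the state is a 2^n-dimensional vector), but the second only holds in the qualified sense that clever interference, not brute parallelism, extracts an answer.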

andrewla 12/09/2024

This is yet another attempt to posit NISQ results (Noisy Intermediate Scale Quantum) as demonstrations of quantum supremacy. This does not allow us to do useful computational work; it's just making the claim that a bathtub full of water can do fluid dynamic simulations faster than a computer with a bathtub-full-of-water-number-of-cores can do the same computation.

If history is any guide we'll soon see that there are problems with the fidelity (the system they use to verify that the results are "correct") or problems with the difficulty of the underlying problem, as happened with Google's previous attempt to demonstrate quantum supremacy [1].

[1] https://gilkalai.wordpress.com/2024/12/09/the-case-against-g... -- note that although coincidentally published the same day as this announcement, this is talking about Google's previous results, not Willow.

radioactivist 12/09/2024

Some of these results have been on the arxiv for a few months (https://arxiv.org/abs/2408.13687) -- are there any details on new stuff besides this blog post? I can't find anything on the random circuit sampling in the preprint (or its early access published version).

ipsum2 12/09/2024

They renamed quantum supremacy to "beyond-classical"? That's something.

zelon88 12/09/2024

> It lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch.

Can someone explain to me how he made the jump from "we achieved a meaningful threshold in quantum computing performance" to "the multiverse is probably real"?

gordon_freeman 12/09/2024

So one of the interesting comparisons between quantum and classical computing in the video: 5 minutes vs 10^25 years. Are there any tradeoffs, or specific cases in which quantum computing works, or is this generic across "all" computing use cases? If the latter, this will change everything and would change the world.

raminf 12/10/2024

Relatives were asking for a basic explainer. Here's a good one by Hannah Fry: https://youtu.be/1_gJp2uAjO0

ramonverse 12/10/2024

Can't wait for Google to release a breakthrough paper in 5 years just for the authors to leave and build OpenQuant

htrp 12/09/2024

105 qubits

tombert 12/09/2024

Interesting; it might be time for me to load up a quantum simulator and start learning how to program these things.

I've pushed that off for a long time since I wasn't completely convinced that quantum computers actually worked, but I think I was wrong.

dom96 12/09/2024

Can anyone comment on how this chip is built? What does the hardware look like?

attentionmech 12/10/2024

What does it mean when they say that the computations are happening in multiverse? I didn't know we are that advanced already :)

staunton 12/09/2024

This is a great technical achievement. It gives me some hope to see that the various companies are able to invest in what is still very basic science, even if it is mostly as a vanity project for advertising purposes.

Quantum computing will surely have amazing applications that we cannot even conceive of right now. The earliest and maybe most useful applications might be in material science and medicine.

I'm somewhat disappointed that most discussions here focus on cryptography or even cryptocurrencies. People will just switch to post-quantum algorithms and most likely still have decades left to do so. Almost all data we have isn't important enough that intercept-now-decrypt-later really matters, and if you think you have such data, switch now...

Breaking cryptography is the most boring and useless application (among actual applications) of quantum computing. It's purely adversarial, merely an inconsequential step in a pointless arms race that we'd love to stop, if only we could learn to trust each other. To focus on this really betrays a lack of imagination.

rkagerer 12/10/2024

How much support infrastructure does this thing need? (eg. Cryogenic cooling?) How big is a whole 'computer' and how much power draw?

jxdxbx 12/10/2024

More misleading random circuit sampling benchmarks. All it proves is that Google has built a quantum computer that does quantum things.

hi41 12/10/2024

If error is getting corrected, doesn't that mean lower entropy? If so, where else is entropy increasing, if that is a valid question to ask?

bn-l 12/09/2024

This is weird. I got this pop up halfway through reading:

> After reading this article, how has your perception of Google changed? Gotten better Gotten worse Stayed the same

mvkel 12/10/2024

Seems to coincide with Google's annual quantum announcement, which has happened each fall since 2014

sys32768 12/09/2024

In other words, get off the cloud so nobody has your encrypted data which they will be able to crack in a few minutes five or ten years from now?

robot 12/09/2024

There is so much skepticism about quantum computing that, instead of inflated marketing language, one should always start with what the biggest problems are and how they remain unsolved, and then introduce the new improvement.

Otherwise there is no knowing if the accomplishment is really significant or not.

whimsicalism 12/09/2024

We need to seriously think about whether our systems and society are even remotely ready for this.

Havoc 12/10/2024

So is it time to 100x key length on browsers yet or not?

beyondCritics 12/09/2024

>It lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch.

Makes sense, or doesn't it? What's your take on the multiverse theory?

wslh12/09/2024

ELI5: what I could do if I have this chip at home?

webdevver 12/10/2024

what is this actually useful for?

nuz 12/09/2024

Every time this comes up, people say they're not actually useful for ML. Is that true? And if not, what would they be useful for?
