I've never seen so much money spent on a fundamentally flawed tech since maybe Theranos. I'm really starting to doubt the viability of the current crop of quantum computing attempts. I think there probably is some way to harness quantum effects, but I'm not sure computing with an inherently high margin of error is the right way to do it.
I'm optimistic about current quantum computers, because they are a tool to study wave function collapse. I hope they will help us understand the relation between the number of particles and how long a system can stay in an entangled state, which could point toward a physical interpretation of quantum mechanics (different from the "we don't talk about wave function collapse" Copenhagen interpretation).
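To make that relation concrete: under the standard independent-dephasing model, an N-particle entangled (GHZ) state loses coherence N times faster than a single qubit. Here's a toy sketch of that scaling; the rate `gamma` is an arbitrary illustrative value, not a measured one.

```python
import numpy as np

# Toy model: if each of N qubits independently loses phase coherence
# at rate gamma, the coherence of an N-qubit GHZ state decays as
# exp(-N * gamma * t). gamma here is an arbitrary assumed value.
gamma = 0.1  # per-qubit dephasing rate (arbitrary units)

def ghz_coherence(n_qubits: int, t: float) -> float:
    """Remaining off-diagonal coherence of an N-qubit GHZ state at time t."""
    return np.exp(-n_qubits * gamma * t)

# The time for coherence to fall to 1/e shrinks as 1/N:
for n in (1, 10, 100):
    t_decay = 1.0 / (n * gamma)
    print(f"N={n:>3}: coherence hits 1/e at t = {t_decay:.2f}")
```

If experiments found decay scaling differently from this baseline, that would be exactly the kind of hint about collapse the parent is hoping for.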
> fundamentally flawed tech, since maybe Theranos
That's a pretty dramatic claim. We've had to (and still have to) deal with the same class of problems when going from analog -> digital in chips, communications, optics, etc. etc. The primitives that reality gives us to work with are not discrete.
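As a concrete illustration of that analog -> digital point: classical hardware also builds reliable discrete logic out of noisy primitives, and the basic trick is redundancy plus decoding. A minimal sketch below, using a 3-bit repetition code with majority voting; the error probability `p` is an assumed illustrative figure.

```python
import random

# A noisy channel flips each bit with probability p. Encoding a bit as
# three copies and decoding by majority vote suppresses the logical
# error rate from ~p to ~3p^2. Quantum error correction pursues the
# same idea with far harder constraints (no cloning, continuous errors).
p = 0.05  # physical bit-flip probability (assumed)

def noisy(bit: int) -> int:
    """Send one bit through the channel, flipping it with probability p."""
    return bit ^ (random.random() < p)

def send_raw(bit: int) -> int:
    return noisy(bit)

def send_encoded(bit: int) -> int:
    # Encode as three copies, decode by majority vote.
    copies = [noisy(bit) for _ in range(3)]
    return int(sum(copies) >= 2)

trials = 100_000
raw_errs = sum(send_raw(0) != 0 for _ in range(trials))
enc_errs = sum(send_encoded(0) != 0 for _ in range(trials))
print(f"raw error rate:     {raw_errs / trials:.4f}")   # ~p    = 0.05
print(f"encoded error rate: {enc_errs / trials:.4f}")   # ~3p^2 = 0.0075
```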
I think quantum computing research makes a lot more sense through the lens of “real scientists had to do something for funding while string theory was going on”.
Quantum computing may or may not get industrial results in the next N years, but those folks do theory and then, often if not usually, (in)validate it by experiment: it’s science.
I feel like these are extremely different things being compared.
For a lot of technology, most really, the best way to learn how to improve it is to build the best version you can and then work on making it better. That's what's been done with all the current quantum computing attempts. Pretty much all of the industry labs with general-purpose quantum computers can in fact run programs on them; they just haven't reached the point of running programs that are useful beyond proving out and testing the system.
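For a sense of what "running a program" means at this stage, here's a hedged sketch of a Bell-state circuit in Qiskit, run on the local Aer simulator (assumes `pip install qiskit qiskit-aer`; pointing it at real hardware is a provider-specific change not shown here):

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into superposition
qc.cx(0, 1)    # entangle qubit 1 with qubit 0
qc.measure_all()

sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=1024).result().get_counts()
print(counts)  # expect a roughly even split between '00' and '11'
```

Programs of roughly this scale and character are what the labs run today: enough to exercise and validate the system, not yet enough to be useful beyond that.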