I know what a Hilbert space is, and I also know that this 2^n-dimensional space cannot be accessed except through a destructive measurement operation. An exponential state space does not imply that there is exponential computing power to be harnessed there.
As an analogy, when you execute a randomized classical algorithm, the size of the state space in which the bits live is also exponential (and at the end you observe the result, and your uncertainty collapses from a probability distribution to one of its possible outcomes). Yet you would look at me like I'm crazy (or a fraud) if I claimed that randomized algorithms have exponentially more computing power than deterministic ones.
The only way in which the quantum case differs from the classical picture above is that amplitudes have a phase and can thus interfere (constructively or destructively). The art of creating quantum algorithms lies entirely in orchestrating favorable interference patterns.
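To make the interference point concrete, here's a toy numpy sketch (simulating the amplitudes classically, so of course nothing here is a speedup): applying a Hadamard gate twice returns you to |0> exactly, because the two paths leading to |1> carry opposite phases and cancel. A classical coin flip, by contrast, can never be "unflipped" by flipping again.

```python
import numpy as np

# One qubit starting in |0>: an amplitude vector, not a probability vector.
state = np.array([1.0, 0.0], dtype=complex)

# Hadamard gate: note the -1 entry -- amplitudes can be negative,
# which probabilities cannot.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ state   # equal superposition of |0> and |1>
back = H @ superposed    # the two |1> contributions cancel (destructive interference)

probs = np.abs(back) ** 2   # measuring now gives |0> with certainty
```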
The way I like to explain it simply to people is that right now, you can use frameworks to manipulate probability distributions. Quantum logic gates are basically a restriction on the operations you can use to combine pdfs. Ultimately, unless you can find a clever way to convert an algorithm into one that uses pdfs and then ends up with a pdf where one single value holds 99% of the EV, QM ain't gonna help ya.
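A numpy sketch of that restriction, as I'd put it: a classical randomized step is a stochastic matrix acting on a probability vector, while a quantum gate is a unitary matrix acting on an amplitude vector. The unitary's negative entries are what let "mixing" undo itself, which no stochastic matrix can do.

```python
import numpy as np

start = np.array([1.0, 0.0])

# Classical randomized step: a stochastic matrix acting on a probability
# vector (entries >= 0, columns sum to 1).
flip = np.array([[0.5, 0.5],
                 [0.5, 0.5]])

# Quantum step: a unitary matrix acting on an amplitude vector.
# Entries may be negative or complex, which is what permits cancellation.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

once_classical = flip @ start            # uniform: [0.5, 0.5]
twice_classical = flip @ once_classical  # still uniform -- the mixing is irreversible

twice_quantum = H @ (H @ start)          # back to [1, 0] -- the mixing undoes itself
```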
It seems like we should be more careful when claiming an "exponential" increase in computational performance. For many quantum algorithms, the speedup is actually superpolynomial [1] [2]. In some sense, this is due to the fact that the state space grows exponentially but, as you correctly pointed out, it can only be accessed in a destructive manner. For many algorithms (e.g. Shor's), the net result is a superpolynomial improvement in the resources required for solving a practically important problem (factoring).
Unfortunately, the nuance of superpolynomial vs exponential is lost in many high-level discussions about quantum computing. Maybe we should just say "much, much faster" ;) To make matters worse, quantum computing textbooks often present Simon's Problem [3] as a showcase for truly exponential speedup. It turns out this is misleading, as I've never heard of a practically relevant algorithm with truly exponential speedup.
> In some sense, this is due to the fact that the state space grows exponentially
What I take issue with is precisely the conflation of the size of the state space with the quantum speedup. Shor's algorithm is fast because the QFT (quantum Fourier transform) creates an interference pattern that can reveal the period of certain functions, and the QFT can be implemented efficiently because of its specific structure. As I said before, the classical state space of a probability distribution is also exponentially large, so no, the root cause is emphatically not the size of the state space, but the way in which that space can be manipulated and the fact that amplitudes add up in a way that's not linear (when looking at the resulting probabilities).
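A cheap way to see the QFT point is to fake the post-oracle state of Shor's algorithm classically and just take a discrete Fourier transform of it (toy sizes; np.fft stands in for the QFT, which a quantum computer implements in poly(log N) gates rather than the classical O(N log N)):

```python
import numpy as np

N, r = 64, 8   # register size and hidden period (toy values)

# After the modular-exponentiation step and a measurement of the output
# register, the input register holds equal amplitudes on x = 0, r, 2r, ...
amps = np.zeros(N)
amps[::r] = 1.0
amps /= np.linalg.norm(amps)

# The (unitary) Fourier transform turns that comb into sharp peaks at
# multiples of N/r: constructive interference at those frequencies,
# destructive interference everywhere else.
freq = np.fft.fft(amps) / np.sqrt(N)
probs = np.abs(freq) ** 2

peaks = np.flatnonzero(probs > 1e-9)   # multiples of N/r, revealing the period
```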
Note that Grover's algorithm achieves only a quadratic speedup with the same size of state space as Shor's. Your explanation doesn't add up, it just adds to the confusion.
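The Grover contrast is easy to simulate, too: the same exponentially large amplitude vector, but the interference pattern only nudges the marked amplitude by O(1/sqrt(N)) per iteration, hence the merely quadratic speedup (toy sizes, marked index chosen arbitrarily):

```python
import numpy as np

n = 10
N = 2 ** n
marked = 373   # arbitrary marked item

state = np.full(N, 1 / np.sqrt(N))        # uniform superposition
iterations = int(np.pi / 4 * np.sqrt(N))  # ~sqrt(N) steps, vs ~N/2 classical guesses

for _ in range(iterations):
    state[marked] *= -1                   # oracle: flip the marked amplitude's sign
    state = 2 * state.mean() - state      # diffusion: inversion about the mean

p_success = state[marked] ** 2            # close to 1 after O(sqrt(N)) iterations
```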
I just think that it's very important to stay far away from the (wrong, but pervasive in pop science) idea that quantum computers are fast because they "try exponentially many solutions in parallel". Excessively highlighting the size of the state space is already a step too far in that direction for my taste.
My words are a bit harsh, but I do appreciate the fact that you are engaging honestly, and please don't take my skepticism personally. I would like to hear what your technology brings to the table, how it differs from competing approaches, etc.
I am in agreement with your sentiment here. Adding a qubit does not mean that every single thing you do on a quantum computer doubles in speed, which is a possible way to interpret some of my statements.
From a purely personal perspective, I do think that it is very interesting that we can affect the entirety of a state with an otherwise linear number of physical operations. Whether that is useful in providing lots of exponential or even polynomial speedups in the arena of practical algorithms is yet to be determined. I suspect that with a robust enough computer, the answer will be a resounding "yes".
> As I said before, the size of a classical state space of a probability distribution is also exponentially large...
This is true, but a single state in a classical probability distribution is not exponentially large. Because of superposition, a single quantum state can be associated with an exponentially large number of amplitudes. As you mentioned, quantum algorithms rely on the interference of these amplitudes. However, if you could somehow assign a complex amplitude to each state in a classical probability distribution, you would still be limited to manipulating only one amplitude at a time. It is in this sense that the exponential scaling is important.
> Note that Grover's algorithm achieves only a quadratic speedup with the same size of state space as Shor's. Your explanation doesn't add up, it just adds to the confusion.
I didn't mean to imply that all quantum algorithms have superpolynomial speedups. But (especially) for the ones that do, I was talking about the exponentially large set of amplitudes being manipulated in parallel.
> However, if you could somehow assign a complex amplitude to each state in a classical probability distribution, you would still be limited to manipulating only one amplitude at a time.
This is probably just more confusing. What I should say is that classical probabilities have no physical manifestation that you can directly manipulate - they just denote our lack of information about a system. Amplitudes in quantum systems can be related to probabilities, but they don't represent lack of information. The probabilistic nature of quantum systems is deeper than that: measurements project superposition states onto classical states in a probabilistic way. But before this projection, we're forced to say that the physical state of the system is in superposition. Moreover, the amplitudes associated with each part of the superposition state are part of the physical definition of the state. In this sense, they are more "real" than classical probabilities.
For exponentially large superposition states, there are an exponential number of amplitudes. When we act on the state in certain ways, we update all of the amplitudes in parallel. There is no counterpart to this when acting on classical states, including when you have incomplete information about the state (and thus an exponentially large probability distribution).
> But (especially) for the ones that do, I was talking about the exponentially large set of amplitudes being manipulated in parallel.
Let me try again. The built-in exponential in the physical state (as I described above) helps me see how quantum speedups (especially super-polynomial ones) could even be possible. You're right that there's more to the story than just having an exponentially large number of amplitudes, but it's an important part of the story!
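That "one physical operation, exponentially many amplitudes" point is also exactly what makes classical simulation expensive: a simulator has to do 2^n scalar updates for a single gate the hardware applies once. A numpy sketch with made-up sizes:

```python
import numpy as np

n, k = 12, 5                     # 12 qubits, gate applied to qubit k (toy values)
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                   # |00...0>

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# One gate on one qubit: the simulator reshapes the 2^n amplitudes so the
# target qubit gets its own axis, then touches every amplitude once.
psi = state.reshape(2 ** k, 2, 2 ** (n - k - 1))
state = np.einsum('ab,ibj->iaj', H, psi).reshape(-1)

nonzero = np.count_nonzero(np.abs(state) > 1e-12)  # 2 of the 4096 amplitudes
```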