> In some sense, this is due to the fact that the state space grows exponentially
What I take issue with is precisely the conflation of the size of the state space with the quantum speedup. Shor's algorithm is fast because the QFT (quantum Fourier transform) creates an interference pattern that can reveal the period of certain functions, and the QFT can be implemented efficiently because of its specific structure. As I said before, the state space of a classical probability distribution is also exponentially large, so no, the root cause is emphatically not the size of the state space, but the way in which that space can be manipulated and the fact that amplitudes combine nonlinearly when you look at the resulting probabilities.
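To make the interference point concrete, here is a toy numpy sketch (a classical simulation, not Shor's algorithm itself; N and r are arbitrary choices): start from a uniform superposition over the preimage of a period-r function, then apply a discrete Fourier transform, which is what the QFT effects on the amplitude vector. Constructive interference puts all probability mass on multiples of N/r, which is exactly what reveals the period.

```python
import numpy as np

# Toy illustration: uniform superposition over the indices where a
# period-r function takes one particular value.
N, r = 64, 8                       # state-space size and hidden period
state = np.zeros(N, dtype=complex)
state[::r] = 1.0                   # support on 0, r, 2r, ...
state /= np.linalg.norm(state)

# The QFT acts as a discrete Fourier transform on the amplitude vector.
freq = np.fft.fft(state) / np.sqrt(N)
probs = np.abs(freq) ** 2

# Interference concentrates probability on multiples of N/r = 8.
peaks = np.nonzero(probs > 1e-9)[0]
print(peaks)                       # multiples of 8: 0, 8, 16, ..., 56
```

The efficiency claim is the part this sketch cannot show: `np.fft.fft` touches all N entries, while the QFT circuit does the same transformation with only polynomially many gates in the number of qubits.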
Note that Grover's algorithm achieves only a quadratic speedup with the same size of state space as Shor's. Your explanation doesn't add up, it just adds to the confusion.
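For reference, Grover's quadratic speedup is easy to see in a small statevector simulation (again a classical sketch; n and the marked index are arbitrary choices): roughly (pi/4)*sqrt(N) oracle-plus-diffusion rounds concentrate the amplitude on the marked item, regardless of the exponential size of the state space.

```python
import numpy as np

# Statevector sketch of Grover search for one marked item out of 2**n.
n, marked = 8, 42
N = 2**n
psi = np.full(N, 1 / np.sqrt(N))   # uniform superposition over all items

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # O(sqrt(N)) rounds
for _ in range(iterations):
    psi[marked] *= -1              # oracle: flip the marked amplitude
    psi = 2 * psi.mean() - psi     # diffusion: inversion about the mean

print(iterations, np.argmax(np.abs(psi) ** 2))   # prints: 12 42
```

The same exponentially large amplitude vector is being manipulated here as in the period-finding case, yet only a quadratic speedup comes out, which is the point: the structure of the manipulation matters, not the size alone.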
I just think that it's very important to stay far away from the (wrong, but pervasive in pop science) idea that quantum computers are fast because they "try exponentially many solutions in parallel". Excessively highlighting the size of the state space is already a step too far in that direction for my taste.
My words are a bit harsh, but I do appreciate the fact that you are engaging honestly, and please don't take my skepticism personally. I would like to hear what your technology brings to the table, how it differs from competing approaches, etc.
I am in agreement with your sentiment here. Adding a qubit does not mean that every single thing you do on a quantum computer doubles in speed, which is a possible way to interpret some of my statements.
From a purely personal perspective, I do think that it is very interesting that we can affect the entirety of a state with an otherwise linear number of physical operations. Whether that is useful in providing lots of exponential or even polynomial speedups in the arena of practical algorithms is yet to be determined. I suspect that with a robust enough computer, the answer will be a resounding "yes".
> As I said before, the size of a classical state space of a probability distribution is also exponentially large...
This is true, but a single state in a classical probability distribution is not exponentially large. Because of superposition, a single quantum state can be associated with an exponentially large number of amplitudes. As you mentioned, quantum algorithms rely on the interference of these amplitudes. However, if you could somehow assign a complex amplitude to each state in a classical probability distribution, you would still be limited to manipulating only one amplitude at a time. It is in this sense that the exponential scaling is important.
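A quick way to see the distinction in code (a numpy illustration; n is an arbitrary choice): a single classical state of n bits is just n numbers, a probability distribution over such states has 2^n entries but any single draw from it is again one n-bit string, while a single n-qubit pure state already needs 2^n complex amplitudes to specify.

```python
import numpy as np

n = 10

# A single classical state of n bits: just n numbers.
classical_state = np.random.randint(0, 2, size=n)

# A probability distribution over classical states: 2**n entries,
# but a single sample from it is again just one n-bit string.
dist = np.full(2**n, 1 / 2**n)
sample = np.random.choice(2**n, p=dist)

# A single n-qubit pure state: 2**n complex amplitudes are needed,
# even though it is one state, not a distribution over states.
psi = np.random.randn(2**n) + 1j * np.random.randn(2**n)
psi /= np.linalg.norm(psi)

print(classical_state.size, dist.size, psi.size)   # prints: 10 1024 1024
```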
> Note that Grover's algorithm achieves only a quadratic speedup with the same size of state space as Shor's. Your explanation doesn't add up, it just adds to the confusion.
I didn't mean to imply that all quantum algorithms have superpolynomial speedups. But (especially) for the ones that do, I stand by my point about the exponentially large set of amplitudes being manipulated in parallel.
> I just think that it's very important to stay far away from the (wrong, but pervasive in pop science) idea that quantum computers are fast because they "try exponentially many solutions in parallel".
> However, if you could somehow assign a complex amplitude to each state in a classical probability distribution, you would still be limited to manipulating only one amplitude at a time.
This is probably just more confusing. What I should say is that classical probabilities have no physical manifestation that you can directly manipulate - they just denote our lack of information about a system. Amplitudes in quantum systems can be related to probabilities, but they don't represent lack of information. The probabilistic nature of quantum systems is deeper than that: measurements project superposition states onto classical states in a probabilistic way. But before this projection, we're forced to say that the physical state of the system is in superposition. What's more, the amplitudes associated with each part of the superposition state are part of the physical definition of the state. In this sense, they are more "real" than classical probabilities.
For exponentially large superposition states, there are an exponential number of amplitudes. When we act on the state in certain ways, we update all of the amplitudes in parallel. There is no counterpart to this when acting on classical states, including when you have incomplete information about the state (and thus an exponentially large probability distribution).
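A small numpy sketch of this point (a classical simulation; n and the seed are arbitrary choices): applying a single Hadamard gate to one qubit of an n-qubit state updates, in general, every one of the 2^n amplitudes at once, even though physically it is one operation on one qubit.

```python
import numpy as np

n = 12                              # number of qubits
dim = 2**n
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

# A random n-qubit state as a vector of 2**n amplitudes.
rng = np.random.default_rng(0)
psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)

# Apply H to qubit 0 only: expose that qubit's axis via reshape,
# contract with the 2x2 gate, and flatten back.
out = np.einsum("ab,bj->aj", H, psi.reshape(2, dim // 2)).reshape(dim)

# One physical operation on one qubit has changed (generically)
# all 2**n amplitudes of the state.
print(np.count_nonzero(out != psi), "of", dim, "amplitudes changed")
```

A classical simulation pays for this with 2^n arithmetic operations; the quantum system gets the same global update from a single gate, which is the asymmetry being pointed at.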
> But (especially) for the ones that do, I stand by my point about the exponentially large set of amplitudes being manipulated in parallel.
Let me try again. The built-in exponential in the physical state (as I described above) helps me see how quantum speedups (especially super-polynomial ones) could even be possible. You're right that there's more to the story than just having an exponentially large number of amplitudes, but it's an important part of the story!