This Scientific American video, shared by Aeon, is pretty good if you’re looking for a quick basic primer on quantum computing. It’s short, less than nine minutes, although I do have one beef, which I’ll discuss below.

The beef, which is pretty common with popular explanations of quantum computing, is that it implies that on a measurement, the circuit of qubits will somehow collapse to the right answer. This always used to confuse me, because the physics is that the collapse is random (at least operationally). Done just as the video suggests, a quantum computer would be a very expensive system for producing random answers.

As Scott Aaronson often discusses, these systems have to promote the right answer, amplifying the probability that it is the collapsed result while suppressing the probability of the wrong answers. They do this using quantum interference, canceling out the undesirable answers while reinforcing the correct one.
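That amplify-and-suppress idea can be seen in a toy calculation. This is just a plain numpy state-vector sketch of a single Grover iteration over four basis states (the marked index 2 is an arbitrary choice, not anything from the video), not real quantum hardware:

```python
import numpy as np

# Toy amplitude amplification over 4 basis states, with state |2> as the
# "right answer". A plain state-vector calculation, nothing physical.
n = 4
state = np.full(n, 1 / np.sqrt(n))        # uniform superposition: 25% each

# Oracle: flip the sign of the amplitude of the marked answer.
oracle = np.eye(n)
oracle[2, 2] = -1

# Diffusion operator: reflect every amplitude about the mean amplitude.
diffusion = 2 * np.full((n, n), 1 / n) - np.eye(n)

state = diffusion @ (oracle @ state)      # one Grover iteration

probs = np.abs(state) ** 2
print(probs)   # ~[0, 0, 1, 0]: interference boosted |2>, canceled the rest
```

For four states a single iteration is enough; the wrong answers interfere destructively down to zero, so a measurement here would collapse to the right answer essentially every time.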

Incidentally, getting an idea of how this manipulation of quantum effects works is what pushed me over the line toward wave function realism (at least to some degree). Until reading about quantum computing, I was fine with the idea that the wave function is just a mathematical tool that doesn’t describe actual reality. But this technology depends on the structure of the theory being right throughout the entire process. Holding on to anti-realism in this case just seemed increasingly obdurate, at least to me.


I fear I’ll never understand this.

The video didn’t help? Sorry, John. If it makes you feel any better, it took me a long time to grasp it. (And it doesn’t take much questioning to get beyond what I know.)

Oh I watched it, but it’s still a mystery.

You and me together

It might help to understand that any given quantum “circuit” (algorithm) is run many times exactly because there is some element of randomness in the measurement. Because of the amplification of the desired result and suppression of undesired results, the circuit produces the desired result most often.

As an example (I have an account at the IBM Quantum Computing Experience site), I ran a circuit that created a correlated, entangled Bell pair. When I submitted the job, one parameter was the number of “shots,” and I used 1024. The results were:

00: 456
01: 26
10: 60
11: 482

Which was what I expected. Upon measuring a correlated pair, QM tells us we should get either 00 or 11 with equal odds, and never 01 or 10. But this is a quantum simulation, so there is some probability of getting “incorrect” answers. Pretty clearly the correct answers win and are close to 50/50, but some shots gave 01 or 10.

It’s great that you’re leaning realist, but I’m not clear on exactly why. Both realist and anti-realist views agree with QC results.
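The Bell circuit described above (a Hadamard on one qubit followed by a CNOT) can be sketched in plain numpy. This isn’t IBM’s toolchain, just an ideal state-vector simulation sampled 1024 times; a noiseless run has no 01/10 counts at all, which is why the stray ones above come down to hardware/simulation noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Gates for the Bell circuit: Hadamard on qubit 0, then CNOT (control 0).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1                               # start in |00>
state = CNOT @ np.kron(H, I2) @ state      # Bell state (|00> + |11>)/sqrt(2)

# 1024 "shots": each measurement collapses randomly per the Born rule.
probs = np.abs(state) ** 2
shots = rng.choice(4, size=1024, p=probs)
counts = {f"{k:02b}": int((shots == k).sum()) for k in range(4)}
print(counts)   # only '00' and '11' appear, each near 512
```

Note the collapse itself is still random; it’s the circuit that guarantees the only live outcomes are the correlated ones.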

It seems like realists and anti-realists always work with the same mathematics and make the same predictions. But I was impressed by the fact that QC needs the wave function to be accurate, not just at certain points, but dynamically throughout all the processing. Maybe that was well established long ago in scientific experiments, but seeing it successfully utilized in technology really made it harder to rationalize away, at least for me.

Of course, this is essentially the no-miracles argument that realists have always used, that for a theory to be consistently accurate in its predictions, yet not reflect reality, at least to some degree, would be a miracle.

Indeed (and one of the reasons I fall on the realist side). That said, some anti-realist interpretations don’t deny realism so much as see it like Kant’s noumena — something we can never hope to access.

You may recall von Neumann and his Process 1 and Process 2 of QM. The latter is a basic axiom that all interpretations agree on: per the Schrödinger equation (or its relativistic version), a quantum system evolves in a fully deterministic (and reversible) way. If the system’s state is known at any time, the equation provides its state at any other time, past or future. It’s the quantum analogue of Newton’s laws of motion in classical physics.
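That determinism and reversibility is easy to see in a few lines of numpy. This is a toy four-dimensional system with a random Hermitian “Hamiltonian” (nothing physical about the numbers): evolving forward by the unitary U and then backward by its conjugate transpose recovers the starting state exactly.

```python
import numpy as np

rng = np.random.default_rng(1)

# Build a unitary U = exp(-iH) from a random Hermitian matrix H,
# via its eigendecomposition (H = V diag(w) V†, so U = V exp(-iw) V†).
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
Hm = (A + A.conj().T) / 2                        # random Hermitian "Hamiltonian"
w, V = np.linalg.eigh(Hm)
U = V @ np.diag(np.exp(-1j * w)) @ V.conj().T    # unitary time-evolution step

psi = np.zeros(4, dtype=complex)
psi[0] = 1                                       # known initial state
forward = U @ psi                                # evolve forward in time
back = U.conj().T @ forward                      # run the evolution backward

print(np.allclose(back, psi))   # True: no information was lost
```

No measurement happened anywhere in that calculation, which is exactly why it stays deterministic.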

The first Process, of course, is measurement/observation/collapse/whatever, the one that causes all the angst and debate. QC uses both: one to evolve the system toward a desired result state, and the other to obtain an output.
