The other day I shared a video on quantum computing, which I thought was informative, but the feedback I received is that it wasn’t for anyone not already versed in the subject. Since I once struggled to understand this subject myself, I tried to think of a way of describing it that would actually help. This post is my shot at it.

Truth in advertising for anyone reading this: I’m not a physicist or expert in quantum computing, just an interested layperson. And of course, any description of quantum physics other than the math is going to be controversial. So read with these points in mind.

Quantum computing can be thought of as computation happening in a sort of massive parallel computing cluster. But unlike classic clusters, which might have tens, hundreds, or maybe even thousands of nodes, a quantum computer can have astronomically more nodes than any cluster ever built or that will ever be built. It accomplishes this with quantum superposition, interference, and entanglement, terms which will be explained below.

One way to think of quantum superposition is a particle constantly splitting into different versions of itself, zillions of different versions. The versions of the particle form waves and jostle each other (interfere), which leads to the interference patterns in the double slit experiment. The wave of different versions spreads out until information about the location of the particle gets out into the environment, typically from a measurement. Then all but one of the versions of the particle disappear.

When this happens, there’s no way to know ahead of time which version of the particle will remain. The best quantum theory can do is provide probabilities for each version. This is known as the wave function collapse and is the central mystery of quantum mechanics. Don’t worry if this or any of the rest seems bizarre. It does for everyone. For quantum computing, we just need to accept that it happens.

For entanglement, let’s consider a classical macroscopic example first. Imagine two asteroids traveling through space: asteroid-a and asteroid-b. They pass near enough to each other to gravitationally alter each other’s course. Once that interaction has taken place, asteroid-b’s fate in the universe has been altered by its encounter with asteroid-a, and vice-versa. So there now exist correlations between them.

Now imagine that later asteroid-b passes by asteroid-c and alters its course. So now, asteroid-a has affected asteroid-b’s fate, but has also indirectly had an effect on asteroid-c’s fate. So the correlations are transitive. They spread. We could say that all three asteroids are “entangled” with each other, although at a classical level this is only of limited interest.

Now, let’s imagine two quantum particles traveling through space. Both have been doing the quantum thing, splitting off into zillions of different versions. When they interact, we can model the interaction as two waves interacting. Or we can view it as zillions of interactions happening between different versions of each particle. But just like the asteroid examples above, the particles now have correlations between them.

What makes this interesting at a quantum level is that now, instead of having independent versions of particle-a and particle-b, we now have versions of each *combination* of those particles, that is, we have versions of the set of both particles. If we take a measurement of particle-a it will collapse into one definite version. But when we do that, we also know that a measurement of particle-b is guaranteed to collapse to the corresponding version of the set. This is quantum entanglement.

And like the classical version, quantum entanglement is transitive. So if particle-b later interacts with particle-c, particle-a and particle-c will now be entangled. If this continues for large numbers of particles, all of those particles will now be entangled with each other, meaning that they will be in a joint superposition, in which many versions of the entire set exist rather than separate independent versions of each particle.

One thing to keep in mind: as new particles are added to the entangled set, the number of versions of the overall set gets multiplied by the number of versions of the new particle included in the interaction. In other words, sets with more particles have exponentially more versions of the overall set.

A note about quantum spin. The main thing to understand about spin is that, for any particular measurement, it will be only one of two values. That makes spin a useful property for physically implementing a computational bit. Of course, being quantum, particles can have different versions with one spin and other versions with the opposite spin. These are qubits (quantum bits).
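To make the qubit idea slightly more concrete, here’s a minimal Python sketch. The equal-superposition amplitudes are just an illustrative choice, not anything from the post, and real amplitudes are used to keep things simple:

```python
import math

# A qubit can be described by two amplitudes, one for each spin value.
# Measuring it gives 0 with probability alpha**2 and 1 with probability
# beta**2 (real amplitudes here for simplicity).
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)  # an equal superposition

assert abs(alpha**2 + beta**2 - 1) < 1e-12  # the probabilities must sum to 1
print(f"P(0) = {alpha**2:.2f}, P(1) = {beta**2:.2f}")  # prints P(0) = 0.50, P(1) = 0.50
```

Until a measurement happens, both versions of the qubit are present with those weights.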

Hopefully all of that was clear, because it serves as the raw material of quantum computing.

A quantum computer uses qubits. A qubit will have a version with the value of 1 and another version with the value of 0. When that qubit interacts with another qubit, they become entangled, so there are now multiple versions of the set of qubit-1 and qubit-2. Add a third, and the entanglement includes three qubits.

Eventually all the qubits in a quantum computer are entangled, which means there are many versions of the entire set. So this could be thought of as the quantum processor constantly splitting into different versions of itself, each able to perform a version of the computation it is currently running. We now have our parallel computing cluster.

That said, there are differences between a quantum computer and a classical cluster. A classic cluster has all its nodes right from the beginning. It also usually has one designated controller node, which is typically the one that will provide the final output of any calculation. And the nodes in a classic cluster often communicate with each other over some type of network.

In the quantum version, we can think of it as starting off with one computer that then begins splitting into different versions. The number of possible versions is a function of how many qubits it has. Since each qubit can have two versions, the number of possible versions for the overall computer processor is two to the power of the number of qubits. So a ten qubit computer can have 2^10 or 1024 versions, a fifty qubit computer 2^50 or over a quadrillion versions. A 300 qubit computer can have 2^300 or around 10^90 versions, which is more versions than there are particles in the observable universe.
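The growth in the number of versions can be checked directly. This is pure arithmetic, nothing quantum:

```python
# Each qubit doubles the number of possible versions of the whole
# processor, so n qubits give 2**n versions of the overall set.
for n in (10, 50, 300):
    print(f"{n} qubits -> {2**n:.3e} possible versions")
```

The 300-qubit line comes out around 2 x 10^90, comfortably past the rough particle count of the observable universe.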

But there’s a catch. As soon as there is any output from the system, that counts as a measurement, and it will collapse to just one version of the processor. And just like with the lone particle, there’s no way to know ahead of time which version will be there. There’s no way to know which node in our vast cluster will be left standing to provide the output. We can have the cluster try vast numbers of possible solutions, but when one node finds the right one, we can’t just assume it will be the one left standing after the collapse.

So the system has to “promote” the right answer. One way to think of this is the system needing to get the right answer on as many nodes as possible, so when the collapse happens it will be in the output. This happens by the nodes “communicating” the answer to each other, not through a network, but through quantum interference, that is, by controlling the jostling of the different versions of the computer.

So hopefully when the output happens, the right answer is provided. Due to quantum uncertainty, not every node can get the right answer, so there’s always a small chance of the wrong answer coming out. This possibility is often compensated for by running the same algorithm multiple times and taking the most frequent answer as the right one.
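That repeat-and-vote strategy can be sketched classically. The 80% per-run success rate and the answer 42 below are made-up numbers purely for illustration:

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the sketch is repeatable

def noisy_run(correct=42, p_correct=0.8):
    # Stand-in for one run of a quantum algorithm: it returns the right
    # answer with high probability, an arbitrary wrong answer otherwise.
    return correct if random.random() < p_correct else random.randrange(100)

# Run the "algorithm" many times and take the most frequent result.
counts = Counter(noisy_run() for _ in range(101))
answer = counts.most_common(1)[0][0]
print(answer)
```

With a per-run success rate well above 50%, the majority answer over many runs is almost certain to be the correct one.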

So, that’s my simplified version of quantum computing shorn of a lot of complications like error correction and other issues (many of which I don’t understand myself). Hope it helps. And that the simplifications didn’t cross over into being misleading anywhere.

Thanks. That did help.

Thanks Anonymole! Grateful for the feedback.

There is just one point I would grizzle about in your explanation of quantum computing, though it may (or may not) be irrelevant in the context. (Disclaimer, I am not any sort of QM expert either, just an interested layman with a background in maths, physics and computing.)

Conflating superposition and the double slit experiment is, at best, highly contentious. Yes, David Deutsch just about does so in arguing for the many-worlds interpretation of QM, but that argument was, I reckon, shot down by Ian Stewart’s observation that Feynman’s explanation of light propagating along geodesics in *this* (for any value of “this” 🙂 ) world would mean all light propagating in exactly the same way in all worlds, which can only happen if all those worlds were in fact identical.

Thanks Mike. I actually tried to stay away from interpretation in this particular post. Although it’s hard to describe quantum computing in any intuitive manner without using language that implies wave function realism to at least some degree.

Associating superposition with the double-slit experiment is contentious? I thought the double-slit was exhibit 1 for superpositions. I may not be catching your meaning here.

Would you happen to have a link, paper name, or something on that Ian Stewart observation? I’m pretty interested in the Everett many-worlds interpretation, particularly any (thoughtful) arguments against it.

Sorry, I expressed myself carelessly. I was trying to point out the difference between the useful but fragile superposition of qubits (which is what is usually referred to by “superposition” in popular explanations) and the natural superpositions (relative to a basis), implied by Heisenberg’s uncertainty, which are omnipresent and in no way controversial. But my mind was already ahead of my typing, on David Deutsch’s argument for MW. Call it a momentary lapse of reason. 🙂

Unfortunately, I can’t give you a reference to that point of Ian Stewart’s. It is not in the books by him that I have to hand — I’ve checked. I have a vague feeling that it was in an interview I heard a few years back. His response to Deutsch’s derivation of MW from the double-slit experiment was basically that if QED’s virtual photons destructively interfering with each other were the same as MW ones, interference in our world would have to be the same in other worlds, so how could those worlds be different? He is not a fan of any form of Everett’s Relative State variations.

On that my favourite quote is from Harvey Brown (now retired philosophy professor at Oxford). I once attended a joint lecture of his with David Wallace, who is almost rabidly 🙂 pro-Everett. Brown started his lecture by saying his role was really to assure the audience that Wallace was not mad (yup, that’s what he said!) and confessing that he (Brown) himself was not a fan of MW, but accepted it on the basis of being the QM interpretation he disliked least. Really made me warm up to him. 🙂

No worries. This is all friendly discussion.

Based on your description, Deutsch’s way of describing it would be that the interference happens between worlds. So waves canceling each other out happen across the worlds. In this view, decoherence puts an end to it and we get the apparent collapse. Although most Everettians would say the waves canceling each other out happen before the world has split, which happens on decoherence. Different descriptions for the same overall ontology of the universal wave function.

The interpretation Brown dislikes least is an interesting way of putting it. It reminds me of Scott Aaronson and his somewhat backhanded endorsement of the interpretation. My own credence in it is roughly 50%, in the sense of whether the raw QM formalism is the whole picture. That leaves 50% for hidden variable and other realist solutions, although any one interpretation in that family is well south of 10% for me, at least for now. (In terms of anti-realist interpretations, my take is someone is free to sign on to whichever one disturbs them the least.)

Wouldn’t the quantum erasure experiments speak against the interference ending at decoherence? Specifically, observing the particles going through the slits destroys the overall interference pattern, but if you look at specific subsets of the particles, the interference pattern is still there. Not sure how Deutsch would address this.

I’d have to go back and refresh my knowledge on the details, but I think he’d point out that decoherence hasn’t happened in that case, at least not up to the point where you reestablished the interference pattern. That’s only possible because the wave is still coherent at that point. (In principle it’s possible to do it after decoherence, but it would require reversing all the entropic changes from the environment.)

I’m confused. In the two slit experiment we say that decoherence happened and so we lost the interference pattern. So did it or didn’t it? Because we can look at a subset of the data and see the interference pattern is still there.

In the classic double slit, decoherence happens when the particle hits the screen. There are things you can do to make it happen sooner (such as put a detector at one of the slits). There are also ways to alter the path of the particle that don’t cause decoherence, typically with mirrors or half-silvered mirrors, but that’s getting more into the Mach–Zehnder setup.

The key event is when “which way” information gets out into the environment. Put another way, when the causal effects of the quantum system in question are amplified into the environment and random effects from the environment have broken up the timing of the system. Once that happens, it’s done.

Right, but I’m talking about the quantum erasure thing (I think), where they put a detector at one of the slits, so “once that happens, it’s done”. Except that they look at a subset of those particles, and they find an interference pattern, so it didn’t happen, and so it wasn’t done?

I still think the key event is when “which way” information gets into the environment. If the experiment has that information in its apparatus at some point, but it is reversed (erased) before it gets into the environment, then the information never gets out into the world.

Of course, a lot depends on exactly how the information is “recorded”. If the recording device is complex enough, the info gets out in various ways before it’s “erased”. In practice, the “recording” needs to be something simple and isolated, like another entangled particle in one state or another depending on the information. That’s a minuscule degree of decoherence that can still be reversed. (Decoherence is entanglement en masse.)

Sean Carroll has an interesting post about this: https://www.preposterousuniverse.com/blog/2019/09/21/the-notorious-delayed-choice-quantum-eraser/

I used to dislike MW a lot, until it was pointed out to me that its ontology is in fact no more baroque than that of an old-fashioned infinite universe. Intellectually, I am now sitting on the fence, like you, but I still dislike it emotionally. The option I would like best to be true is Penrose’s gravity induced wave collapse or something on those lines.

That’s a good comparison. It’s often noted that the situation with many-worlds is a lot like Copernicanism in the late 1500s. Before Galileo’s telescope, nobody knew whether it was true or not, but most people rejected it out of hand. One of the reasons was because the stars showed no parallax, indicating that if the Earth moved, they had to be incomprehensibly far away and the universe unimaginably vast and empty. Why would God create such an obscene arrangement?

Of course, that doesn’t mean that this time the obscene arrangement will turn out to be true. But I’m keeping an eye on experiments that stress the quantum formalism, particularly the ever larger systems held in superposition or entanglement.

I don’t really have an emotional preference in the quantum interpretations. I find all the realist ones deeply unsettling in their own way.

Maybe I’m missing something, but I think you need to draw on the features of complex numbers and their amplitudes in order to explain the advantages of quantum computing. It’s because the states are describable by these two-dimensional complex numbers that their dot products (their interactions) can cancel out, or come very close to canceling. This is very important to the proper functioning of the circuits, if I understand correctly.
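The cancellation this comment describes can be seen with two bare complex amplitudes. A minimal sketch, with the magnitudes and phases chosen purely for illustration:

```python
import cmath
import math

# Two paths with equal magnitude but opposite phase: the amplitudes are
# added first, and the probability is the squared magnitude of the sum.
a1 = cmath.exp(1j * 0) / math.sqrt(2)        # phase 0
a2 = cmath.exp(1j * math.pi) / math.sqrt(2)  # phase pi
total = a1 + a2

print(abs(total) ** 2)  # destructive interference: probability ~ 0
```

With the same phase on both paths the probabilities would instead reinforce, which is the constructive side of interference.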

That’s all true, but more involved than I wanted to get for this post. It was aimed at someone just looking for a very casual understanding of quantum computing.

Mike, my question is, how can you ease me into a little bit better understanding of what is meant by “promoting” the correct answer? I’ve not ever really tried to fathom this out, but I gather that to have a meaningful computer we need to have the quantum entanglement thing for an ensemble of qubits AND we need it to have some sort of logical architecture or framework that works like a normal computer (like blocks of adders, for instance, that can do addition). What can you say about this without getting too far into the weeds?

I have a dim understanding of how adders work in a conventional computer, and am imagining that somehow the process in a quantum computer is a bit like feeding a conventional block of adders (I have no idea about computer science terminology so bear with me) with a superposition of many combinations of input values simultaneously rather than cycling through an array of possible values one at a time in serial fashion.

Let’s say I’m testing a thousand pairs of numbers to see if any add up to 1,472,999. If we test all thousand pairs simultaneously through a quantum process (and assuming one and only one pair is correct), then promotion is how we get the quantum computer to output the correct pair.

It strikes me that something physical and quantum has to happen, maybe like decoherence, where pairs of input numbers that do not give the correct sum are somehow lost as noise while the correct value does a quantum Darwin-esque thing and gets multiplied throughout the “environment” where outputs are registered.

I can imagine a receiving “environment” that must somehow repeatedly register and thus multiply the correct value. In decoherence, my understanding is a particular result for a quantum process is multiplied many times in the natural world because that particular state is one that is stable through repeated interactions in the natural world. I cannot explain that well because it’s been a long time since I read about it, and even then cannot claim any expertise obviously, but I recall it being something like this: certain states arise that remain the same after just about any interaction with the environment, (because they are eigenvalues maybe of a density matrix? And thus any transformation they undergo through interaction with other systems gives the same state?)…

Is it something remotely like this?

Michael,

You’re hitting me in the area about this that I’m probably least able to respond to with much detail. But I will say that the promotion uses quantum interference, which requires that the quantum circuit be in a coherent state. It all has to happen before environmental decoherence, because once decoherence gets going in earnest, it’s too late.

For interference, remember that in the double slit experiment, we have dark bands and bright bands on the back screen. That comes about through destructive and constructive interference. With promotion (I have no idea if that’s the right term), the goal is to do constructive interference on the right answer, increasing its probability, and destructive interference on the wrong ones, decreasing their probability to as close to zero as possible.
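The constructive/destructive interference idea can be illustrated with a toy classical simulation of Grover-style amplitude amplification. The 8 states and the marked index below are arbitrary choices, and a real quantum computer does this with gates, not Python lists:

```python
import math

N = 8        # eight "versions" (basis states) of the register
marked = 5   # the version holding the "right answer" (arbitrary choice)
amps = [1 / math.sqrt(N)] * N  # start in a uniform superposition

for step in range(2):  # about (pi/4) * sqrt(N) iterations is optimal
    amps[marked] = -amps[marked]         # oracle: flip the right answer's sign
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]  # diffusion: reflect amplitudes about the mean
    print(f"after step {step + 1}: P(right answer) = {amps[marked] ** 2:.3f}")
```

The marked state’s probability climbs from 1/8 to roughly 0.78 after one step and about 0.95 after two, while the wrong answers are suppressed toward zero, which is the “promotion” being described.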

Here’s Scott Aaronson’s description.

That link, by the way, is an excellent source of information on quantum computing. It can get pretty weedy, but each section usually starts out at a relatively high level.

To your other questions, yes it happens through the circuits of quantum logic gates, although I couldn’t begin to tell you how to construct one that performs the above.

Hope this helps.

I didn’t realize quantum computers could sometimes spit out the wrong answer. I guess there’s no reason to think they wouldn’t do that. It just never occurred to me before that they would.

Strictly speaking, even a classic computer could put out a wrong answer, but they’re engineered to make that exceedingly rare. (Which actually comes with costs, notably in power consumption.) Quantum computers can’t make it nearly as rare, so they have to compensate.

SMBC takes a shot at explaining this that’s worth checking out.

To quote Quine (“Logic and the Reification of Universals”, _From a Logical Point of View_): “What is under consideration is not the ontological state of affairs, but the ontological commitments of a discourse. What there is does not in general depend on one’s use of language, but what one says there is does.”

Thanks Mike. So reality doesn’t depend on language, but what we say about reality does. Am I getting the right takeaway?

Yes, you read it right. I think a lot of philosophical arguments are fuelled by a failure to allow for this distinction.
