Quantum computing 101 with D-Wave’s Vern Brownell

I found this to be an interesting primer on quantum computing.  One takeaway for me is that quantum processors will be useful for specific purposes, not necessarily as general purpose devices.  This implies to me that we might someday have computers with separate quantum processors, with specific jobs delegated to them by the classical processors.

Another takeaway is that a lot of energy has to go into ensuring that the qubits, the basic processing units of quantum computing, are isolated from the environment to prevent premature decoherence.  It’s why so many quantum computing prototypes operate near absolute zero.

One thing I didn’t get (I might have missed it) is how soon practical systems are expected.  It seems like quantum computing requires an expensive rig right now.  Of course, the first classical computers were also outrageously expensive, so there’s good reason to think time will eventually smooth that out.

10 thoughts on “Quantum computing 101 with D-Wave’s Vern Brownell”

  1. I imagine that we are at the same place classical computers were in the 1950s – basically the research stage. Another decade and a small number may be in commercial use. Twenty years and they will be much more available. I envisage that large quantum computers could be made available via the cloud, rather than needing to have one sitting on your desk.

    1. You may be right. Certainly, if isolating the components from the environment continues to require what it now does, that might have to be the model. There’s no guarantee that it will evolve to the point where we carry them in our pockets, as classical computers did.

  2. Thanks for the link. Quantum computing was praised even by Steven Weinberg, so it ‘should’ be something of real value. But this video just beats around the bush (about the difficulty of providing a noise-free environment) and gives no beef (the details of the qubit logic). I went to their homepage but was disappointed there as well.

    Its ‘Vision Statement’ says, “Despite the incredible power of today’s supercomputers, there are many complex computing problems that can’t be addressed by conventional systems.”

    But it does not give any example of what kind of complex computing problems are beyond the reach of classical computers. Without that, there is of course no way to show that quantum computing can be the key to those problems.

    It says, “… to use quantum effects [adding qubit] to compute.”
    But it fails to show what kind of advantage this can bring. By adding a ‘qubit’, it is basically changing a ‘binary-computing’ system into a ‘ternary-computing’ system. Since the ‘qubit’ has a quantum effect, it encompasses a special process (the quantum collapse). With this quantum collapse, the qubit at most produces a ‘fuzzy’ ternary-computing system.
    By the way, it fails to describe the ‘quantum collapse’ mechanism for this qubit. Since this qubit ‘should’ collapse at various points during the computation, what big difference is there between that ‘selected’ collapse mechanism and the random collapse caused by the ‘environment’? Without an answer to this question, all that superconducting environment is just hogwash, a gimmick.

    It says, “The processor considers all possibilities simultaneously to determine the lowest energy required to form those relationships. … Multiple solutions are returned to the user, scaled to show optimal answers.”
    This can be done by classical computing. Absolutely nothing new here.

    1. You ask a lot of the same questions I do on this. Disagreeable Me explained it to me by saying to imagine logic circuits built from a series of two-slit experiments. Properly constructed, you could have a lot of physical processes take place in parallel prior to the collapse at the end, with the architecture somehow designed to ensure that the desired version of the calculation emerges at that collapse.

      That said, given the indeterministic nature of QM, I’ve never heard a good explanation of how that final step can be ensured. I’ve read discussions about doing it with entanglement, but as I understand it, entanglement cannot be used for communication. It seems like there would be an element of randomness in the result, which has never struck me as particularly advantageous for performance.
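
      For what it’s worth, here’s a toy sketch of how I picture the two-slit idea (plain Python, my own illustration, not anything from the video): a single qubit’s two amplitudes pass through two beam-splitter-like gates, the intermediate state is a superposition of both paths, and the amplitudes interfere so the final readout comes out certain.

import math

# Amplitudes for the two "paths" |0> and |1> of a single qubit.
state = (1.0, 0.0)   # start definitely on path |0>

def hadamard(state):
    """A beam-splitter-like gate: sends each path into an equal mix of both."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

after_first = hadamard(state)         # superposition: both paths at once
after_second = hadamard(after_first)  # the two paths interfere

print("after first gate :", after_first)     # (0.707, 0.707) -- both paths
print("after second gate:", after_second)    # ~(1.0, 0.0) -- interference restores |0>
print("P(measure 0)     :", after_second[0] ** 2)  # ~1.0, so the outcome is certain

      In this tiny example the interference makes the final outcome certain; the part I still don’t follow is how that gets arranged for a calculation anyone actually cares about.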

      1. This is a very interesting and important issue. Thanks for bringing it up. I think that we (as humanity) should ask some deep questions on this.

        A ‘qubit’ should exist as a ‘logic’ first, before being implemented as ‘hardware’. For the qubit to become a ‘computing logic’, we should first know its ‘framework’.

        First, is it a (2 + 1) logic? That is, the qubit is an ‘add-on’ to the classical binary logic (system). So the qubit is really a sidekick, kicking the binary computing from time to time. Yet we still need to know the following.
        a. How is a qubit generated in a computing pathway?
        b. What role does the qubit play during the computation? Actively changing the computing pathways, or just acting as ‘possible states’ offered as final ‘choices’? Acting as ‘possible states’ can be done with an algorithm; no quantum effect is needed. Furthermore, parallel computing is an old-school issue; a quantum effect is not ‘necessarily’ needed. As for changing the computing pathways, does anyone know whether it will give a ‘meaningful’ outcome?

        Second, is it a ternary logic (system)? That is, the qubit is a major player (similar to the 0 and 1), not a sidekick. A ternary-computing framework would be very, very interesting, as it requires a completely new logic system (no more ‘and/or’ gates). It becomes a ‘color’-computing system which needs very different kinds of logic gates (red, yellow, blue, white). That is, the entire set of logic-gate circuits must be invented anew. Obviously, this is not what has been done. By the way, this new-gate chip (many times more complex than the old binary one) would not need a superconducting environment to run.

        By all means, I do not see any reason that the qubit logic must be run in a superconducting environment. I do see one hint in the statement, “The processor considers all possibilities simultaneously to determine the lowest energy required to form those relationships.” The key phrase is ‘lowest energy required’; that is, they do not have a qubit-computing logic but are running a ‘horse race’ among many ‘parallel processors’. Thus, the ‘measurement’ of the energy used by those processors becomes the determining factor for the answer ranking, and this ‘measurement’ (not calculation) requires a superconducting environment. If this is the case, its logical validity is totally unproven.

        Of course, if there were ‘one’ example (it needs just one) showing that a problem can only be done with quantum computing, its value would then be certain. If not, then a beauty contest would be in order. For example, calculating π (pi) to one million digits: can a quantum computer outperform the classical one?

        1. Good questions. My understanding is that qubits remain binary, but that they can be in superposition states that allow them to be both 1 and 0 simultaneously. It does seem like all the logic gate designs would look different in a quantum system, but I can’t claim to have a firm enough understanding of this to be sure.
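
          A rough way I picture it (just my own toy sketch, not D-Wave’s design): a qubit’s state is a pair of amplitudes over the two classical values, and reading it yields 0 or 1 with probabilities set by those amplitudes. So it isn’t a third discrete value added to 0 and 1; it’s a weighted blend that collapses to one of them when measured.

import math
import random

# A qubit state: amplitudes (a, b) for |0> and |1>, with |a|^2 + |b|^2 = 1.
a = 1 / math.sqrt(2)
b = 1 / math.sqrt(2)   # an equal superposition of 0 and 1

def measure(a, b):
    """Reading the qubit collapses it: 0 or 1 with probabilities |a|^2 and |b|^2."""
    return 0 if random.random() < abs(a) ** 2 else 1

random.seed(1)
samples = [measure(a, b) for _ in range(1000)]
print("fraction of 1s:", sum(samples) / len(samples))  # ~0.5 for this state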

          I think the reason qubits have to be in a superconducting environment is to minimize any “noise” from the surroundings, to prevent premature decoherence.

          From what I’ve read, very simple quantum computations have been performed, but that’s about it so far. This article is a typical example of what I usually see. You might find it interesting.
          http://www.sciencedaily.com/releases/2014/06/140612142219.htm

          1. Completely different logic architectures and algorithms are required in quantum computing, since in order for the system to remain in a superposition, all operations must be reversible. NAND and OR gates are not reversible operations.
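
            A toy way to see the irreversibility point (my own classical sketch, not actual quantum-gate code): a NAND output cannot in general be traced back to its inputs, while a CNOT-style gate is its own inverse, so applying it twice recovers the inputs exactly.

def nand(a, b):
    return 1 - (a & b)   # two input bits -> one output bit: information is lost

def cnot(control, target):
    return control, target ^ control   # two bits in -> two bits out, invertible

# NAND: three different inputs all map to output 1, so a 1 cannot be "un-computed".
print([(a, b) for a in (0, 1) for b in (0, 1) if nand(a, b) == 1])

# CNOT: applying it twice restores the original pair, for every input.
for a in (0, 1):
    for b in (0, 1):
        assert cnot(*cnot(a, b)) == (a, b)
print("CNOT is its own inverse")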

          2. Interesting; I’d never heard that a process must be reversible to remain in a superposition. That seems like a major constraint on what kinds of logic circuits can be built.

          3. Thanks for the link. I had heard about quantum computing a while back but never truly cared much about it. Your post made me look into it for the first time, and it has gotten me a bit interested now. I think the ‘Time’ cover story {(Lev Grossman, Feb. 6, 2014): “Quantum computing uses strange subatomic behavior to exponentially speed up processing. It could be a revolution, or it could be wishful thinking.”} gives a good description of the current situation. Thus, it could be a good time for us to put in our two cents evaluating the situation.

            One, there is a discrepancy between the ScienceDaily story and the D-Wave claims.

            ScienceDaily story (June 12, 2014): {“Encoding the logical qubit in the seven physical qubits was a real experimental challenge,” relates Daniel Nigg, a member of Rainer Blatt’s research group.
            The physicists achieved this in three steps, where in each step complex sequences of laser pulses were used to create entanglement between four neighboring qubits.
            “For the first time we have been able to encode a single quantum bit by distributing its information over seven atoms in a controlled way,” says an excited Markus Müller, who in 2011 moved from Innsbruck to the Complutense University in Madrid.
            “When we entangle atoms in this specific way, they provide enough information for subsequent error correction and possible computations.”}

            The above passage contains the following keywords:
            Encoding
            logical qubit
            physical qubits
            laser pulses
            entanglement (entangle atoms)
            possible computations
            From these keywords, we can guess at (visualize) a framework.
            a. The laser pulses produce entangled atoms, which become a ‘stable’ structure (and could act as physical qubits).
            b. The ‘information’ (the logical qubit) can be ‘encoded’ on those physical qubits, and this information will stay accurate and stable.
            c. When information can be held by a physical structure, computation becomes ‘possible’.

            This is seemingly only the ‘first’ step. Information standing still is of no use in computation. How do we move this bit of information? How does it interact with other bits? And many more how’s. Without answers to these how’s, there will be no computation.
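
            To get a feel for what ‘distributing one logical bit over several physical carriers’ buys, here is a toy classical analogy (my own sketch; the Innsbruck experiment’s actual seven-qubit code is far more subtle): encode one logical bit into three physical bits, let noise flip some of them, and recover the logical bit by majority vote. Quantum error correction has to achieve something like this without directly reading the encoded data, which is part of what makes it hard.

import random

def encode(bit):
    """One logical bit -> three physical copies (classical repetition code)."""
    return [bit, bit, bit]

def noisy_channel(bits, p_flip=0.1):
    """Each physical bit flips independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit if at most one copy flipped."""
    return int(sum(bits) >= 2)

random.seed(0)
trials = 10_000
raw_errors = sum(noisy_channel([1])[0] != 1 for _ in range(trials))
enc_errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print("unencoded error rate:", raw_errors / trials)   # ~0.10
print("encoded error rate  :", enc_errors / trials)   # ~0.03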

            On the other hand, D-Wave already has a functional product (the D-Wave Two). The following are the properties of the machine, as described on its website:
            1. A lattice of 512 tiny superconducting circuits, known as qubits, is chilled close to absolute zero to get quantum effects
            2. A user models a problem into a search for the “lowest point in a vast landscape”
            3. The processor considers all possibilities simultaneously to determine the lowest energy required to form those relationships
            4. Multiple solutions are returned to the user, scaled to show optimal answers

            Seemingly, D-Wave does not use laser pulses to generate the quantum effects. That is, at this point, we seemingly have two different stories about quantum computing.
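
            For concreteness, item 2 above (“lowest point in a vast landscape”) can be pictured with a tiny toy: an Ising-style energy function over a few binary variables, minimized here by brute force on a classical machine. (This is only an illustrative sketch of the kind of problem D-Wave describes, with made-up numbers; the machine is claimed to find the minimum via quantum annealing rather than by enumerating every configuration, which stops being feasible as the variable count grows.)

from itertools import product

# Toy "energy landscape": an Ising-style cost over 4 binary variables.
# h = per-variable biases, J = pairwise couplings (hypothetical values).
h = {0: 1.0, 1: -0.5, 2: 0.3, 3: -1.2}
J = {(0, 1): -1.0, (1, 2): 0.8, (2, 3): -0.6, (0, 3): 0.4}

def energy(s):
    """E(s) = sum_i h[i]*s[i] + sum_{i<j} J[i,j]*s[i]*s[j], with s[i] in {-1, +1}."""
    return (sum(h[i] * s[i] for i in h)
            + sum(J[i, j] * s[i] * s[j] for (i, j) in J))

# Brute force: a classical computer can simply walk this whole landscape,
# but the number of configurations doubles with every added variable.
states = sorted(product([-1, 1], repeat=4), key=energy)
for s in states[:3]:
    print(s, round(energy(s), 2))   # the lowest-energy "solutions", ranked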

          4. Thanks for sharing your research!

            Brownell does say that there are several different approaches being pursued right now.

            The “It could be a revolution, or it could be wishful thinking” quote feels like a good assessment. I do tend to think they’ll find one or more ways to do quantum computations. The question is whether it will be judged worth the expense. It might turn out to be like the Concorde: possible and impressive, but not worth the cost.
