
Quantum computing explained – 5 levels of difficulty

Update July 11, 2022

Quantum advantage? – the long road ahead to making a useful quantum computer.

• Wired > “Quantum Advantage Showdowns Have No Clear Winners” by Sophia Chen (July 11, 2022) – A series of recent experiments between quantum and classical computers shows the term’s ever-evolving meaning.

Each claim of quantum advantage has spurred other researchers to develop faster classical algorithms to challenge that claim.

Quantum computer
Credit: Pixabay/CC0 Public Domain

Original post February 19, 2022

So, how is quantum physics connected with quantum computing?

Ask people on the street “What is a quantum computer?” and you’ll likely get a variety of replies: a puzzled “Huh?”, media buzz and hype, fiction and myths, varying degrees of reality and research. So, levels of understanding.

A “classic” 2021 Wired article [1] about Majorana fermions included a 2018 video of IBM’s Dr. Talia Gershon (Senior Manager, Quantum Research) explaining quantum computing to 5 different people: a child, a teen, a college student, a grad student, and a professional.

“5 levels” is a format referenced in other posts, but particularly noteworthy is Wired’s “5 levels” video series (16 episodes released 2017 – 2022).

• Wired > “Quantum Computing Expert Explains One Concept in 5 Levels of Difficulty” (Released on 06/25/2018)

Dr. Gershon uses some props: a model of a quantum computer aka “the chandelier” and spinning coins.

She also notes IBM’s quantum computing initiative, free cloud access to some current quantum computers, and an exciting technology future.

I don’t think you’re gonna have one in your dorm room anytime soon but you’ll have access to one. There’s three free quantum computers that are all sitting in this lab here that anyone in the world can access through the cloud.

You know now that everybody around the world can access a quantum computer through the cloud, people are doing all kinds of cool things. They’re building games. [Quantum pong?]

This is such an exciting time in the history of quantum computing. Only in the last couple years have real quantum computers become available to everyone around the world. This is the beginning of a many decade adventure where we’ll discover so many things about quantum computing and what it’ll do. We don’t even know all of the amazing things it’s gonna do. And to me that’s the most exciting part.

Quantum computing is a fascinating endeavor to engineer and develop applications which rely on quantum properties: “So superposition is one quantum property that we use, entanglement is another quantum property, and a third is interference.”
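
As a rough illustration (not from the video, and not IBM’s software stack), here is a minimal sketch of those three properties using plain Python and NumPy – the state vectors and gate matrices are the standard textbook ones:

# Minimal sketch of superposition, interference, and entanglement
# using textbook state vectors and gate matrices (illustrative only).
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)                 # controlled-NOT gate

# Superposition: H|0> = (|0> + |1>)/sqrt(2), equal amplitudes on 0 and 1
plus = H @ ket0
print("superposition:", np.round(plus, 3))      # [0.707 0.707]

# Interference: a second Hadamard recombines the amplitudes back into |0>
print("interference: ", np.round(H @ plus, 3))  # [1. 0.], the |1> path cancels

# Entanglement: CNOT on (H|0>) tensor |0> gives the Bell state (|00> + |11>)/sqrt(2)
bell = CNOT @ np.kron(plus, ket0)
print("entanglement: ", np.round(bell, 3))      # [0.707 0. 0. 0.707]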

The quantum computing romance has begun – as Ray Bradbury noted in a 1971 panel discussion:

“I think it’s part of the nature of man to start with romance and build to a reality. In order to get the facts we have to be excited to go out and get them and there’s only one way to do that—through romance.”

Notes

[1] Wired > “Microsoft’s Big Win in Quantum Computing Was an ‘Error’ After All” (Feb 12, 2021)

Quantum computers are built from devices called qubits that encode 1s and 0s of data but can also use a quantum state called a superposition to perform math tricks not possible for the bits in a conventional computer. The main challenge to commercializing that idea is that quantum states are delicate and easily quashed by thermal or electromagnetic noise, making qubits error-prone [qubits’ flakiness].

Google, IBM, and Intel have all shown off prototype quantum processors with around 50 qubits, and companies including Goldman Sachs and Merck are testing the technology. But thousands or millions of qubits are likely required for useful work. Much of a quantum computer’s power would probably have to be dedicated to correcting its own glitches.

Microsoft has taken a different approach, claiming qubits based on Majorana particles will be more scalable, allowing it to leap ahead. But after more than a decade of work, it does not have a single qubit.

Related post: Levels of understanding – what are X-rays?

7 thoughts on “Quantum computing explained – 5 levels of difficulty”

  1. This MIT Technology Review article discusses the mismatch between visions of quantum computing – theoretical (so-called “in principle”) demonstrations, research funding, company stock market valuations, and claims of commercial applications – and current realities. And uses analogies to early 1900s vacuum tubes, the first transistor in 1947, and the Wright brothers’ 1903 Wright Flyer.

    • MIT Technology Review > “Quantum computing has a hype problem” by Sankar Das Sarma [1] (March 28, 2022) – Quantum computing startups are all the rage, but it’s unclear if they’ll be able to produce anything of use in the near future.

    As a buzzword, quantum computing probably ranks only below AI in terms of hype. Large tech companies such as Alphabet, Amazon, and Microsoft now have substantial research and development efforts in quantum computing. A host of startups have sprung up as well, some boasting staggering valuations. IonQ, for example, was valued at $2 billion when it went public in October through a special-purpose acquisition company. Much of this commercial activity has happened with baffling speed over the past three years.

    The qubit systems we have today are a tremendous scientific achievement, but they take us no closer to having a quantum computer that can solve a problem that anybody cares about.

    … “entanglement” and “superposition” are not magic wands that we can shake and expect to transform technology in the near future.

    Terms

    • Prime factorization
    • RSA-based cryptography
    • Quantum error correction
    • Decoherence
    • Stable qubit
    • Topological quantum computing
    • NISQ – “noisy intermediate-scale quantum” computer
    • Time crystals

    Notes

    [1] In his article, physicist Sankar Das Sarma, director of the Condensed Matter Theory Center at the University of Maryland, College Park, notes:

    I am as pro-quantum-computing as one can be: I’ve published more than 100 technical papers on the subject, and many of my PhD students and postdoctoral fellows are now well-known quantum computing practitioners all over the world.

  2. Although somewhat off topic for this post on quantum computing, quantum sensors also rely on quantum error correction (QEC).

    Also notable is the timing of (or delay between) “sensing” and QEC actions: “… different speeds in information acquisition … result in a distortion of the output.” A refined mathematical model of such QEC-induced bias might permit rectification [tweaking] in post-processing (see the schematic sketch at the end of this comment).

    • Phys.org > “The side effects of quantum error correction and how to cope with them” by Andreas Trabesinger, ETH Zurich (April 6, 2022)

    Quantum systems can interact with one another and with their surroundings in ways that are fundamentally different from those of their classical counterparts. In a quantum sensor [e.g., for sensing the strength of a magnetic and electric field in which it is immersed], … its sensitivity can surpass what is possible, even in principle, with conventional, classical technologies.

    Unfortunately, quantum sensors are exquisitely sensitive not only to the physical quantities of interest, but also to noise. One way to suppress these unwanted contributions is to apply schemes collectively known as quantum error correction (QEC).

    Depending on the length of this delay time [between “sensing” and QEC actions], the dynamics of the quantum system, which should ideally be governed by the Hamiltonian operator alone, becomes contaminated by interference by the error operators.

    QEC-induced bias (image caption)

    The dynamics of a quantum sensor. Errors such as noise cause a damping of the signal relative to the ideal case. Quantum error correction recovers substantial parts of the lost signal strength, but also shifts the sensing frequency, leading to the progressive build-up of a bias (shown as gray bars). Adapted from Rojkov et al. Phys. Rev. Lett. 128, 140503 (2022).
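
    A purely schematic sketch (in Python, with made-up numbers – not the Rojkov et al. model) of what “rectification in post-processing” could look like: if a model predicts how much bias the QEC procedure adds per sensing cycle, that predictable part can be subtracted from the raw estimates afterwards, leaving only the statistical noise.

    # Illustrative only: a hypothetical, linearly accumulating QEC-induced bias
    # is removed in post-processing using the bias predicted by a model.
    import numpy as np

    rng = np.random.default_rng(0)
    true_value = 1.000            # quantity the sensor should report (arbitrary units)
    bias_per_cycle = 0.002        # bias per cycle predicted by the (hypothetical) model
    cycles = np.arange(1, 6)

    # Raw estimates drift away from the true value as the bias builds up
    noise = rng.normal(0.0, 0.0005, cycles.size)
    raw = true_value + bias_per_cycle * cycles + noise

    # Post-processing: subtract the modeled bias; only statistical noise remains
    corrected = raw - bias_per_cycle * cycles

    print("raw:      ", np.round(raw, 4))
    print("corrected:", np.round(corrected, 4))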

  3. Here’s an interesting recap of the history of quantum computing and an outlook for cutting-edge quantum research.

    • Caltech Magazine > “Quantum’s Hub: The Institute for Quantum Information and Matter (IQIM)” by Omar Shamout (Summer 2022) – Includes video by John Preskill, Caltech’s Richard P. Feynman Professor of Theoretical Physics.

    As a sign of quantum computing’s progression at Caltech and beyond, the Institute partnered with Amazon to build the AWS Center for Quantum Computing, which opened on campus last year. The goal of the collaboration is to create quantum computers and related technologies that have the potential to revolutionize data security, machine learning, medicine development, sustainability practices, and more.

    Quantum computer frame
    Photo: Graham Carlow/IBM

  4. [Logged 1-30-2023]

    Here’s a brief article about the state of quantum computing.

    • Wired > “Quantum Computing Has a Noise Problem” by Amit Katwala (Jan 17, 2023) – Today’s devices can be thrown off by the slightest environmental interference.

    Tech giants including Google, Microsoft, and IBM are racing to build quantum devices, but collectively the field is mired in an era known in the business as “noisy intermediate-scale quantum,” or NISQ. Today’s quantum computers are delicate devices that can be thrown off course by the slightest environmental interference: They’re slow, small-scale, and not that accurate, which means that right now they’re kind of useless.

  5. This Wired article claims to discuss: “Everything you ever wanted to know about qubits, superpositioning, and spooky action at a distance.” Some historical recap (with a timeline and jargon), levels of understanding, practical uses, and additional references.

    So, here’s a “belly flop into the murky shallows of quantum computing 0.101.”

    • Wired > “The WIRED Guide to Quantum Computing” by Tom Simonite & Sophia Chen (Feb 22, 2023) – It’s not productive (or polite) to ask people working on quantum computing when exactly those dreamy applications will become real.

  6. Some progress in creating practical quantum bits – with a toy sketch of the redundancy idea at the end of this comment.

    • Phys.org > “Doubling a qubit’s life, researchers prove a key theory of quantum physics” by Yale University (March 23, 2023) – Michael Devoret’s group [applied physics lab] has managed to more than double the lifetime of quantum information – their error-corrected qubit lived for 1.8 milliseconds.

    Quantum error correction, which was theoretically discovered in 1995, offers a means to combat … decoherence. Employing redundancy, it protects the quantum bit of information by encoding it in a system larger than what, in principle, is needed to represent a single qubit.

    This larger system, however, makes … the encoded qubit even more fragile. … this process had never been able to definitively extend the lifetime of a quantum bit in practice. … Contrary to theoretical promises, in most experiments, error correction accelerates the decoherence of quantum information.

    “For the first time, we have shown that making the system more redundant and actively detecting and correcting quantum errors provided a gain in the resilience of quantum information,” Devoret said. “Our experiment shows that quantum error correction is … more than just a proof-of-principle demonstration.”

    Volodymyr Sivak, the lead author of the paper [March 22 in Nature], said that this performance was achieved in part by employing a machine-learning agent that tweaked the error correction process to improve the outcome.

    Bloch sphere diagram
    Bloch sphere: a geometrical representation of a qubit, a two-level quantum system. The superposition state |ψ⟩ = cos(θ/2)|0⟩ + e^(iφ) sin(θ/2)|1⟩ is parametrized by the angles θ and φ. Image credit: Wikipedia.
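
    To make the redundancy idea quoted above concrete, here is a toy sketch (plain Python) of the simplest possible scheme, the three-bit repetition code with majority-vote decoding. The Yale experiment uses a far more sophisticated quantum encoding; this only illustrates why encoding one bit in a larger, redundant system suppresses errors.

    # Toy illustration: encode one logical bit in three physical bits and
    # correct any single bit-flip by majority vote (illustrative only).
    import random

    def encode(bit):
        return [bit, bit, bit]

    def apply_noise(bits, p_flip):
        # Flip each physical bit independently with probability p_flip
        return [b ^ 1 if random.random() < p_flip else b for b in bits]

    def decode(bits):
        # Majority vote corrects any single flipped bit
        return 1 if sum(bits) >= 2 else 0

    def logical_error_rate(p_flip, trials=100_000):
        errors = sum(decode(apply_noise(encode(0), p_flip)) != 0 for _ in range(trials))
        return errors / trials

    # With a 5% flip chance, an unprotected bit fails 5% of the time, while the
    # encoded bit fails only when at least 2 of the 3 bits flip (about 0.7%).
    print("physical error rate:", 0.05)
    print("logical error rate: ", logical_error_rate(0.05))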

  7. An analogy for fleeting quantum coherence – like floating soap bubbles in the wind.

    Here’s a useful recap of the current state of play in the technology. The fleeting duration of coherence time is always fascinating, as are the trade-offs among the approaches.

    • YouTube > Sabine Hossenfelder > “Quantum Computing with Light – The Breakthrough?”

    DESCRIPTION

    What if we could harness the power of photons to process information? We can! It’s called photonic computing. It’s one of the new approaches to quantum computing – and it’s looking more and more likely that it could be the key to make quantum computers work.

    It’s not the only newcomer. In this video, we’ll take a look at photonic computing and two other newcomers: optical tweezers and topological quantum computing. Maybe the breakthrough for quantum tech is just around the corner.

    00:00 Intro
    00:22 Quantum Computing Recap
    02:12 Front Runners
    06:23 Newcomer #1: Photons
    10:20 Newcomer #2: Atoms in Tweezers
    12:48 Newcomer #3: Topological States
    15:40 Summary
