News on the quantum physics grapevine, Frankfurt Institute theoretical physicist Sabine Hossenfelder tells me, is that Google will announce something special next week: their paper on achieving quantum supremacy, the realization of a quantum computer that outdoes its conventional counterpart. This talk of quantum supremacy may sound familiar: in mid-September, NASA, which contributed to the paper, inadvertently uploaded, then promptly pulled, an outdated version of the paper from its “Technical Reports” server—but not before it made waves on the web. It’s no wonder: Google and company claim that their quantum computer can do in 200 seconds what it would take a supercomputer 10,000 years to do.
The crucial difference between a supercomputer and a quantum computer is the way they store information. For the former it’s a matter, as with any conventional computer, of binary bits, 1s and 0s; for the latter, it’s a matter of quantum bits, which can hold combinations of 0 and 1. No, this doesn’t mean that a quantum bit can, like Schrödinger’s cat, be two contradictory things at once—both alive and dead or, in this case, both a 0 and a 1. Rather, a quantum bit is, as theoretical computer scientist Scott Aaronson helpfully put it on his blog, “Shtetl-Optimized,” a “complex linear combination of this and that,” of 0 and 1. “Maybe this and maybe that” comes closest to a rough approximation. “You can then define a [quantum computer],” Aaronson says, “as simply a computer that would exploit this new kind of ‘maybe’: the one that was discovered in the 1920s and involves complex numbers and is out there in the universe.”
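Aaronson’s “complex linear combination” can be made concrete in a few lines of Python. This is a minimal sketch, not anyone’s actual quantum-computing code: a single qubit is just a pair of complex amplitudes attached to the classical values 0 and 1, and measurement turns each amplitude into a probability via its squared magnitude (the Born rule). The particular amplitudes below are an illustrative choice.

```python
import math

# A qubit's state: two complex amplitudes, one on the value 0, one on 1,
# normalized so that the squared magnitudes sum to 1.
alpha = complex(1 / math.sqrt(2), 0)   # amplitude on 0
beta = complex(0, 1 / math.sqrt(2))    # amplitude on 1 (purely imaginary)

# Measuring resolves the "maybe": the chance of reading out 0 or 1 is the
# squared magnitude of the corresponding amplitude (the Born rule).
p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
print(p0, p1)  # each 0.5 here: "maybe this and maybe that"
```

The complex (rather than merely fractional) amplitudes are the point of Aaronson’s “new kind of ‘maybe’”: amplitudes can cancel as well as add, which ordinary probabilities cannot do.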
Why does exploiting this “maybe” grant quantum computers supremacy? “To give you an idea of how much more a quantum computer can do, think about this: One can simulate a quantum computer on a conventional computer just by numerically solving the equations of quantum mechanics,” Hossenfelder explains in a YouTube video she made several months ago, in anticipation of the Google paper’s release.
“If you do that,” she goes on, “then the computational burden on the conventional computer increases exponentially with the number of q-bits that you try to simulate. You can do 2 or 4 q-bits on a personal computer. But already with 50 q-bits you need a cluster of supercomputers. Anything beyond 50 or so q-bits cannot presently be calculated, at least not in any reasonable amount of time.”
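Hossenfelder’s scaling argument can be put in concrete numbers. Simulating n qubits classically means storing 2ⁿ complex amplitudes; at 16 bytes per complex number (two 64-bit floats), memory doubles with every added qubit. The sketch below is a back-of-envelope estimate under those assumptions, not a statement about any particular simulator.

```python
def state_vector_bytes(n_qubits: int) -> int:
    """Memory needed to hold a full n-qubit state vector:
    2**n complex amplitudes at 16 bytes each."""
    return (2 ** n_qubits) * 16

for n in (4, 30, 50):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n} qubits -> {gib:,.6f} GiB")
```

Four qubits fit in a few hundred bytes; around 30 qubits you hit the 16 GiB of a decent personal computer; and 50 qubits demand roughly 16 PiB, far beyond any single machine — which is why, past 50 or so, even clusters of supercomputers run out of road.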
Google’s quantum computer is a purple chip named Sycamore. In a picture of it in the paper, you can make out a pair of engravings—“Google AI Quantum,” on one side, and “Sycamore” on the other, beneath an engraved sycamore tree. Sycamore was designed to use 54 superconducting qubits, or transmons. I say “designed” because one of the qubits malfunctioned. So the chip, in their experiment comparing its computing speed against that of a “state-of-the-art supercomputer,” used 53, which ended up being fine for the “task of sampling the output of a pseudo-random quantum circuit.” This sort of computation is almost without structure, making it a “suitable choice for benchmarking,” the researchers say, since it renders classical computers rather slow-going. Sycamore’s success, they conclude, “heralds the advent of a much-anticipated computing paradigm.”
“Quantum supremacy,” which Caltech theoretical physicist John Preskill, director of the Institute for Quantum Information and Matter, coined in 2012, is an epoch-making phrase. Preskill meant it as a kind of threshold. “I wanted to emphasize that this is a privileged time in the history of our planet, when information technologies based on principles of quantum physics are ascendant,” he wrote, this month, in Quanta. There is a bit of an asterisk, he seems to say, that should be appended to Google’s result. “The catch, as the Google team acknowledges, is that the problem their machine solved with astounding speed was carefully chosen just for the purpose of demonstrating the quantum computer’s superiority,” he wrote. “It is not otherwise a problem of much practical interest.” That’s how Hossenfelder sees it, too. Today’s quantum computers really seem to be just “new toys for scientists,” she says, because the “generation of random variables that can be used to check quantum supremacy is not good [enough] to actually calculate anything useful.”
Still, Preskill says what Google’s done amounts to a significant step on the quest for practicality. He thought it would be useful to coin an almost-there phrase “for the era that is now dawning”—“noisy intermediate-scale quantum,” or NISQ. “Noisy” means imprecise. Qubits are still too slippery for error-free computing—the longer a quantum computer runs, the more mistakes it racks up, making the result of a calculation unreliable. “Intermediate-scale” means just big enough to do the sort of thing Google did: demonstrate quantum supremacy—beat a supercomputer handily—but too small, on the order of hundreds of qubits, to do anything valuable. So, we’re firmly in the NISQ (pronounced like “risk”) era, which, as Aaronson wrote in his post, “Scott’s Supreme Quantum Supremacy FAQ!”, is “at least a somethingburger!”
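Why “noisy” is so punishing can be seen with a toy model. If each gate operation succeeds with probability (1 − p), a whole circuit’s fidelity decays roughly like (1 − p) raised to the number of gates — so longer computations become exponentially less trustworthy. The 0.1 percent error rate below is an illustrative assumption for the sketch, not Sycamore’s actual spec.

```python
def circuit_fidelity(gate_error: float, num_gates: int) -> float:
    """Toy model: probability an entire circuit runs without a single
    gate error, assuming independent errors at rate gate_error per gate."""
    return (1.0 - gate_error) ** num_gates

# With a 0.1% per-gate error rate, reliability collapses as circuits grow.
for gates in (100, 1_000, 10_000):
    print(f"{gates:>6} gates -> fidelity {circuit_fidelity(0.001, gates):.4f}")
```

At a hundred gates the circuit usually finishes cleanly; at ten thousand it almost never does. That is the wall NISQ machines run into, and why error correction — not yet practical — is the presumed way out.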
It’s nothing to get too excited about yet. “This”—NISQ—“is really a term invented to make investors believe that quantum computing will have practical applications in the next decades or so,” Hossenfelder says. “The trouble with NISQs is that while it is plausible that they soon will be practically feasible, no one knows how to calculate something useful with them.” Perhaps no one ever will. “I am presently quite worried that quantum computing will go the same way as nuclear fusion, that it will remain forever promising but never quite work.”
Google’s researchers, and their NASA collaborators, understandably put a more positive spin on this uncertainty. Quantum computing, they say, is “transitioning” from being merely academic to being the key to unlocking new computational powers. “We are only one creative algorithm away from valuable near-term applications.” Who knows when its time will come.
Brian Gallagher is the editor of Facts So Romantic, the Nautilus blog. Follow him on Twitter @BSGallagher.