If the era of quantum computing dawned 3 years ago, its rising sun may have drifted behind a cloud. In 2019, Google researchers claimed to have passed a milestone known as quantum supremacy when their Sycamore quantum computer performed in 200 seconds an abstruse calculation that they said would tie up a supercomputer for 10,000 years. Now, scientists in China have performed the computation in a few hours using ordinary processors. They say a supercomputer could beat Sycamore outright.

“I think they’re right that if they had access to a big enough supercomputer, they could have simulated … the task in a matter of seconds,” says Scott Aaronson, a computer scientist at the University of Texas, Austin. The advance takes some of the shine off Google’s claim, says Greg Kuperberg, a mathematician at the University of California, Davis. “Getting to within 300 feet of the summit is less exciting than reaching the top.”

Still, the promise of quantum computing remains undimmed, say Kuperberg and others. Sergio Boixo, principal scientist for Google Quantum AI, said in an email that the Google team knew its edge might not last. “In our 2019 paper, we said that classical algorithms would improve,” he said. But “we don’t think this classical approach can keep up with quantum circuits in 2022 and beyond.”

The “problem” Sycamore solved was designed to be hard for a conventional computer but as easy as possible for a quantum computer, which manipulates qubits that can be set to 0, 1, or, thanks to quantum mechanics, any combination of 0 and 1 at the same time. Together, Sycamore’s 53 qubits, tiny circuits of superconducting metal, can encode any number from 0 to 2^{53} (nearly 9 quadrillion), or even all of them at once.
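That scale is what defeats a brute-force classical simulation: a simulator that tracks the full quantum state must store one complex amplitude per possible bit string, so memory grows as 2^n. A back-of-the-envelope sketch (the 16-byte figure assumes standard double-precision complex numbers; it is an illustration, not a number from the article):

```python
# Memory needed to store the full state vector of an n-qubit computer,
# assuming one 16-byte double-precision complex number per amplitude.
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

# Sycamore's 53 qubits: 2^53 amplitudes, nearly 9 quadrillion.
amplitudes = 2 ** 53
print(amplitudes)                     # 9007199254740992
print(state_vector_bytes(53) / 1e15)  # roughly 144 petabytes
```

This is why simulators resort to disk storage or, as below, to tensor networks rather than holding the whole state in memory.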

Starting with all the qubits set to 0, the Google researchers applied to single qubits and pairs a random but fixed set of logical operations, or gates, over 20 cycles, and then read out the qubits. Crudely speaking, quantum waves representing all possible outputs spread among the qubits, and the gates created interference that boosted some outputs and canceled others, so some outputs should have appeared more likely than others. Over millions of trials, a spiky output pattern emerged.
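The experiment can be caricatured in a few lines of NumPy. This toy version uses 5 qubits and one random unitary per cycle instead of Sycamore's layered one- and two-qubit gates, so it is a sketch of the idea, not of Google's circuit:

```python
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 5              # toy size; Sycamore had 53
dim = 2 ** n_qubits

# Start with every qubit set to 0, i.e. all amplitude on |00000>.
state = np.zeros(dim, dtype=complex)
state[0] = 1.0

# 20 "cycles" of gates, caricatured here as one random unitary per
# cycle (obtained by QR-decomposing a complex Gaussian matrix).
for _ in range(20):
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, _ = np.linalg.qr(z)
    state = q @ state

# Interference boosts some outputs and cancels others, leaving a
# spiky, noise-like probability distribution over the 2^n outputs.
probs = np.abs(state) ** 2
```

Sampling repeatedly from `probs` plays the role of Google's millions of trials; the spiky pattern is what a simulator must reproduce.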

The Google researchers argued that simulating those interference effects would overwhelm even Summit, a supercomputer at Oak Ridge National Laboratory that boasts 9,216 CPUs and 27,648 graphics processing units (GPUs). Researchers at IBM, which developed Summit, quickly countered that if they exploited every bit of the computer’s available hard-drive storage, it could handle the computation in a few days. Now, Pan Zhang, a statistical physicist at the Institute of Theoretical Physics of the Chinese Academy of Sciences, and his colleagues have shown how to beat Sycamore in a paper published in Physical Review Letters.

Following others, Zhang and colleagues recast the problem as a 3D mathematical array called a tensor network. It consisted of 20 layers, one for each cycle of gates, with each layer comprising 53 dots, one for each qubit. Lines connected the dots to represent the gates, with each gate encoded in a tensor, a 2D or 4D grid of complex numbers. Running the simulation then reduced to, essentially, multiplying all the tensors together. “The advantage of the tensor network method is that we can use many GPUs to do the computations in parallel,” Zhang says.
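At its core, contracting a tensor network means multiplying the tensors while summing over the shared indices the lines represent. A minimal illustration with NumPy's `einsum`, using a tiny two-qubit circuit (two Hadamards followed by a controlled-Z) rather than Zhang's 53-qubit, 20-layer network:

```python
import numpy as np

# One-qubit gates are 2x2 tensors; a two-qubit gate is a 2x2x2x2 tensor.
h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # Hadamard gate
cz = np.diag([1.0, 1, 1, -1]).reshape(2, 2, 2, 2)  # controlled-Z gate
zero = np.array([1.0, 0.0])                        # a qubit set to 0

# Contracting the network = summing over shared indices: a,b feed the
# Hadamards (outputs c,d), which feed the CZ gate (outputs e,f).
amplitudes = np.einsum('a,b,ca,db,efcd->ef', zero, zero, h, h, cz)
probs = np.abs(amplitudes) ** 2   # each of the 4 outputs has probability 1/4
```

Contractions like this map naturally onto batched matrix multiplications, which is why they spread so well across many GPUs, the advantage Zhang cites.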

Zhang and colleagues also relied on a key insight: Sycamore’s computation was far from exact, so theirs didn’t need to be either. Sycamore calculated the distribution of outputs with an estimated fidelity of 0.2%, just enough to distinguish the fingerprint-like spikiness from the noise in the circuit. So Zhang’s team traded accuracy for speed, cutting some lines in its network and eliminating the corresponding gates. Cutting just eight lines made the computation 256 times faster while maintaining a fidelity of 0.37%.
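One plausible reading of those numbers (an inference from the network's structure, not something the article states explicitly): each line is a summed index of dimension 2, so removing one roughly halves the contraction work, and eight cuts compound to a 2^8 speedup.

```python
# Each cut line removes a binary summed index from the contraction,
# roughly halving the work; 8 cuts then compound multiplicatively.
cuts = 8
speedup = 2 ** cuts
print(speedup)  # 256, matching the reported factor
```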

The researchers calculated the output pattern for 1 million of the 9 quadrillion possible number strings, relying on an innovation of their own to obtain a truly random, representative set. The computation took 15 hours on 512 GPUs and yielded the telltale spiky output. “It’s fair to say that the Google experiment has been simulated on a conventional computer,” says Dominik Hangleiter, a quantum computer scientist at the University of Maryland, College Park. On a supercomputer, the computation would take a few dozen seconds, Zhang says, 10 billion times faster than the Google team estimated.

The researchers say the advance underscores the pitfalls of racing a quantum computer against a classical one. “There’s an urgent need for better quantum supremacy experiments,” Aaronson says. Zhang suggests a more practical approach: “We should find some real-world applications to demonstrate the quantum advantage.”

Still, the researchers say, Google’s demonstration was not mere hype. Sycamore required far fewer operations and less power than a supercomputer, Zhang notes. And had Sycamore achieved a slightly higher fidelity, he says, his team’s simulation couldn’t have kept up. As Hangleiter puts it, “The Google experiment did what it was meant to do: start this race.”