A recent paper co-authored by NTT Research Cryptography & Information Security (CIS) Lab Senior Scientist Mark Zhandry provides a new approach to verifying quantum advantage or proving quantumness. Discussed by Dr. Zhandry in a workshop at the Simons Institute for the Theory of Computing on June 13, the paper was described as a “breakthrough” by University of Texas at Austin Professor of Computer Science Scott Aaronson, who also presented at the event. The paper, titled “Verifiable Quantum Advantage without Structure,” was co-authored by Takashi Yamakawa, a distinguished researcher at NTT Social Informatics Laboratories.

The event was a joint reunion workshop of two programs – Quantum Wave in Computing; and Lattices: Algorithms, Complexity, and Cryptography – originally held during the Spring 2020 semester at the Simons Institute, which is located on the campus of UC Berkeley. Immediately before the talk by Dr. Zhandry, who is also an assistant professor of computer science at Princeton University, Dr. Aaronson spoke about recent progress in quantum advantage, which is also associated with the term quantum speedup.

Quantum computers are commonly assumed to outpace classical computers, but that assumption is worth examining. First, what kind of quantum computing are we talking about? Second, are they in fact superior and if so, by how much? Can a quantum computer solve a problem in a minute or a second that takes a classical computer a week? Or does it take the classical computer an unfathomably exponential amount of time? And third, can you verify this superiority and do so efficiently?

These questions provided a framework for Dr. Aaronson’s talk, which focused exclusively on *exponential* speedups. At the center of his slide-deck presentation was a Venn diagram dubbed the “Field Guide to Exponential Speedups!” It provides helpful context for understanding the Yamakawa-Zhandry paper. He labeled the top-left set of the diagram “NISQable,” a loose term referring to noisy intermediate-scale quantum computing, which is the kind that we will see in the near term (in contrast, for example, to fault-tolerant quantum computing, which will not be available for many more years). The bottom set is labeled “Solidly Beats Classical.” Claimed exponential speedups often turn out to be doubtful, and as Aaronson noted: “Classical computers get to fight back.” What counts here are shutouts, or overwhelmingly convincing wins. The top-right set is labeled “Efficiently Verifiable,” which concerns the ability to check the math on whether a given argument for quantum advantage is true.

The interplay of these sets is where things get interesting. Aaronson includes Yamakawa-Zhandry, for instance, in the intersection of “Efficiently Verifiable” and “Solidly Beats Classical.” But this subset, in which he also places work by Peter Shor and Urmila Mahadev, does not overlap with exponential speedups that are NISQable. In other words, these arguments can be readily checked mathematically but not demonstrated practically any time soon.

The other two intersections are also noteworthy. One points to models (Boson Sampling and Random Circuit Sampling) that solidly beat classical and could be plausibly demonstrated on a NISQable or somewhat “noisy” quantum computer within the next five years. But in these examples, verification requires solving an exponentially hard problem. While verification can be performed for the smallest instances, it is intractable for the instances of interest. The other intersection identifies two approaches, quantum approximate optimization algorithm (QAOA) and quantum machine learning, which are NISQable and efficiently verifiable, but not definitive in terms of superiority over classical. No one has yet reached the holy grail at the center of this diagram, including Yamakawa and Zhandry, whose work still qualifies as a breakthrough.

“This is the first time that we’ve seen an exponential quantum speedup for an NP search problem, which only requires a random oracle,” said Aaronson. That last point is key: By only requiring a random oracle, i.e., a theoretical black box that returns a random response to each query, Yamakawa and Zhandry built their problem on unstructured computational assumptions. As such, as Zhandry explained in his talk, it aligns more closely with unstructured one-way functions than with structured assumptions, such as those underlying public-key cryptography. This facilitates efficient verification.
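For readers unfamiliar with the random-oracle model, the “black box that returns a random response to each query” can be made concrete with a short sketch. The class below is purely illustrative (the parameter `out_bits` and the lazy-table design are hypothetical choices, not anything from the paper): it answers each fresh query with a uniformly random value, while remembering past answers so that repeating a query gives a consistent result, which is exactly the behavior a random oracle is defined to have.

```python
import secrets

class RandomOracle:
    """Toy random oracle (illustrative sketch, not from the paper).

    Each new query gets a fresh uniformly random answer; repeated
    queries return the previously sampled answer, so the oracle
    behaves like a fixed random function that is revealed lazily.
    """

    def __init__(self, out_bits=8):
        self.table = {}          # remembers answers already given
        self.out_bits = out_bits # size of each random response

    def query(self, x):
        if x not in self.table:
            # Sample a fresh random value the first time x is asked.
            self.table[x] = secrets.randbits(self.out_bits)
        return self.table[x]
```

In cryptographic practice, a concrete hash function such as SHA-256 is used as a stand-in for such an oracle; the point of the model is that the function has no exploitable structure beyond what queries reveal.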

The Yamakawa-Zhandry problem itself is to simultaneously solve two problems. One is to find an n-symbol string that is a codeword of a given error-correcting code. (Their choice of code was a variant of Reed-Solomon, famous for applications in numerous fields, including video processing and telecommunications.) The other is to find an n-symbol string where each symbol is mapped to zero under the random oracle. Each problem separately is easy. But to find a single string of symbols that is both a codeword and maps to zero is much harder, at least classically. “If you’re quantum, you can solve this in polynomial time,” said Zhandry in an interview, “but if you’re classical, at least if you’re in this black-box model, you need exponential time.” On the other hand, given a potential solution, it is simple to verify that it is, indeed, a solution, just by checking that it separately solves each of the two problems.
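The efficient-verification step described above can be sketched in code. The toy below is only a loose illustration of the shape of the check, not the paper’s actual construction: the tiny field size `P`, the code length `N`, the degree bound `K`, and the hash-based “maps to zero” condition are all hypothetical stand-ins. It checks (1) that a candidate string lies in a small Reed-Solomon-style code (symbols are evaluations of a low-degree polynomial), and (2) that each symbol satisfies a zero condition under a hash used as a stand-in for the random oracle. Both checks run in polynomial time, which is what makes verification easy even though finding a solution classically is not.

```python
import hashlib

# Toy parameters (hypothetical; the paper uses a much larger
# Reed-Solomon variant over a big field).
P = 97   # prime field size
N = 8    # codeword length: symbols are evaluations at x = 1..N
K = 3    # codewords come from polynomials of degree < K

def poly_eval(coeffs, x):
    """Evaluate a polynomial (coeffs[0] + coeffs[1]*x + ...) mod P."""
    y = 0
    for c in reversed(coeffs):
        y = (y * x + c) % P
    return y

def lagrange_eval(points, x):
    """Evaluate at x the unique degree-<len(points) polynomial
    through the given (xi, yi) points, working mod P."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

def is_codeword(word):
    """Check 1: interpolate from the first K symbols, then confirm
    the remaining symbols agree with that low-degree polynomial."""
    pts = [(x, word[x - 1]) for x in range(1, K + 1)]
    return all(lagrange_eval(pts, x) == word[x - 1]
               for x in range(K + 1, N + 1))

def oracle_is_zero(i, symbol):
    """Check 2: a hash stands in for the random oracle; 'maps to
    zero' is modeled (arbitrarily) as the first digest byte mod 4."""
    h = hashlib.sha256(f"{i}:{symbol}".encode()).digest()
    return h[0] % 4 == 0

def verify(word):
    """A candidate solves the combined problem iff it passes both
    easy checks; each check alone is cheap to perform."""
    return is_codeword(word) and all(
        oracle_is_zero(i, s) for i, s in enumerate(word))
```

For example, evaluating any fixed degree-2 polynomial at x = 1..8 yields a string that passes `is_codeword`, while flipping one symbol makes it fail; `verify` additionally demands the oracle condition, and it is the conjunction of the two easy checks that is hard to satisfy classically.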

Beyond showing a new way to verify quantum advantage, the Yamakawa-Zhandry paper points to something new about the extent of exponential speedup. “Prior to our work, we did have examples of quantum advantage for NP problems, like factoring or, in the black box setting, period finding. But it turns out that the quantum algorithm underlying all these examples was basically period finding – though showing how to apply period-finding to these examples was often non-trivial,” Zhandry said. “Our paper shows there’s at least a second case. You could optimistically interpret that as [saying] there’s hope that quantum advantage is more widespread than maybe we previously thought.”

At the Simons event, Dr. Zhandry spoke again on June 14, delivering two talks (here and here) on new quantum cryptographic primitives. Focusing on potential applications of the quantum no-cloning principle, he surveyed existing knowledge, interesting open problems, and next steps. He said that even from the theoretical perspective, as far as trying to realize these unclonable cryptographic objects, “we’re still in the relatively early days.” For more on Dr. Zhandry, his background and current areas of research, including post-quantum cryptography, see this recent profile and Q&A.