As billions of dollars flood into quantum computing and countries develop communication networks protected by quantum encryption, the importance of quantum information science grows.
The Breakthrough Prize in Fundamental Physics this year recognizes four pioneers who integrated math, computer science, and physics to perform “foundational work in the realm of quantum information.” The prize is shared by IBM’s Charles Bennett, the University of Montreal’s Gilles Brassard, the University of Oxford’s David Deutsch, and the Massachusetts Institute of Technology’s Peter Shor.
“These four individuals made significant contributions to the development of quantum information theory,” says Nicolas Gisin, an experimental quantum physicist at the University of Geneva. “It’s also nice to see these prizes coming closer and closer to my own field.”
The Breakthrough Prizes were co-founded in 2012 by the Israeli-Russian entrepreneur and physicist Yuri Milner, and they have received generous support from other business titans, including fellow co-founders Mark Zuckerberg and Sergey Brin. Milner’s past financial ties to the Kremlin have drawn criticism, a situation made especially charged by Russia’s continuing invasion of Ukraine, much as Alfred Nobel was criticized because the fortune funding the Nobel Prizes came from his invention of dynamite. Milner has previously emphasized his independence and his support for Ukrainian refugees; according to a spokesperson, he moved to the United States in 2014 and has not visited Russia since.
Recognition for quantum information science, however, has not always come so quickly or with such financial backing. Broadly, the field is a synthesis of two theories: quantum mechanics, which describes the paradoxical behavior of the atomic and subatomic worlds, and information theory, which addresses the mathematical and physical limits of computation and communication. The field’s history has been fitful, with intermittent breakthroughs that mainstream scientific journals frequently neglected.
In 1968 Stephen Wiesner, then a Ph.D. student at Columbia University, devised a new way of encoding information in polarized photons. Among other ideas, Wiesner proposed that the inherent fragility of quantum states could be harnessed to create counterfeit-proof quantum money. But Wiesner, who died in 2021, was unable to publish many of his heady theoretical ideas, and, increasingly drawn to religion, he eventually left academia to work as a construction worker in Israel.
Before leaving Columbia, Wiesner shared some of his thoughts with another young researcher. “Stephen Wiesner, one of my roommate’s boyfriends, started telling me about his ‘quantum money,'” Bennett says. “[It] struck me as interesting, but it didn’t appear to be the start of a whole new field.” Bennett met Brassard in the late 1970s, and the two began talking about Wiesner’s money, which they felt would involve the implausible feat of capturing photons with mirrors to construct a quantum banknote.
“Photons are not meant to stay; they are meant to travel,” Brassard says, recalling his thinking. “What is more natural than communicating when they travel?” The scheme Bennett and Brassard devised, known as BB84, would launch the discipline of quantum cryptography. BB84, which was later written up and popularized in Scientific American, lets two parties exchange messages in complete secrecy; if a third party eavesdrops, the snooping leaves indelible evidence of the interception, like a broken quantum wax seal.
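The logic of BB84 can be conveyed with a toy simulation. The Python sketch below is an illustrative classical model, not real optics: the symbols “+” and “x” stand for the two photon polarization bases, and the eavesdropper’s effect is simplified to re-randomizing any photon she measures in the wrong basis.

```python
import random

def bb84_sift(n_rounds, eavesdrop=False, seed=0):
    """Toy BB84: return Alice's and Bob's sifted keys (simplified model)."""
    rng = random.Random(seed)
    key_a, key_b = [], []
    for _ in range(n_rounds):
        bit = rng.randint(0, 1)            # Alice's random bit...
        basis_a = rng.choice("+x")         # ...sent in a random basis
        basis_b = rng.choice("+x")         # Bob measures in a random basis
        received = bit
        if eavesdrop and rng.choice("+x") != basis_a:
            # Eve measured in the wrong basis: the photon's state is
            # disturbed, so Bob's outcome is re-randomized (toy model).
            received = rng.randint(0, 1)
        if basis_a == basis_b:             # sifting: keep matching-basis rounds
            key_a.append(bit)
            key_b.append(received)
    return key_a, key_b

# Without an eavesdropper the sifted keys agree exactly; with one,
# roughly a quarter of the sifted bits disagree, exposing the intrusion.
a, b = bb84_sift(1000, eavesdrop=True)
error_rate = sum(x != y for x, y in zip(a, b)) / len(a)
```

Comparing a random sample of the sifted key over a public channel reveals the elevated error rate, which plays the role of the broken quantum wax seal in the analogy above.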
While Bennett and Brassard were working on quantum cryptography, another radical idea was taking shape: quantum computing. At a now famous conference held at M.I.T. Endicott House in Dedham, Mass., in May 1981, physicist Richard Feynman argued that a computer based on quantum principles could solve problems beyond the reach of any classical computer. Although Deutsch did not attend the meeting, he was captivated by the concept. “I gradually became more convinced of the connections between computation and physics,” he explains.
Later that year, while talking with Bennett, Deutsch had a major epiphany: the prevailing theory of computation was based on the wrong physics, the “classical” mechanics of Isaac Newton and the relativity of Albert Einstein, rather than the deeper quantum reality. “So I decided to rethink the theory of computation based on quantum theory rather than classical theory,” Deutsch explains matter-of-factly. “I didn’t expect to learn anything substantially new from it. I simply expected it to be more rigorous.” He soon understood, however, that he was describing a profoundly different type of computer: even when it obtained the same results, it did so by quantum-mechanical means.
Deutsch’s novel theory helped bridge quantum mechanics and information theory. “It made quantum mechanics comprehensible to me in my language of computer science,” says Umesh Vazirani, a computer scientist at the University of California, Berkeley. Later, with Australian mathematician Richard Jozsa, Deutsch devised the first quantum algorithm that is exponentially faster than its classical counterparts, although it did nothing of practical use.
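The flavor of that speedup is easiest to see in its one-bit special case, Deutsch’s algorithm, which decides with a single query whether a function f on one bit is constant (f(0) = f(1)) or balanced; classically, two queries are unavoidable. Below is a toy single-qubit state-vector sketch in Python, using the simplified phase-oracle formulation (our own illustrative code, not anyone’s production simulator).

```python
import math

def deutsch(f):
    """Decide whether f: {0,1} -> {0,1} is constant or balanced
    using a single oracle query (toy state-vector simulation)."""
    r = 1 / math.sqrt(2)
    state = [r, r]                           # Hadamard on |0>: superpose x = 0 and 1
    state = [(-1) ** f(x) * amp              # ONE phase-oracle query touches
             for x, amp in enumerate(state)]  # both branches at once
    state = [r * (state[0] + state[1]),      # final Hadamard interferes the
             r * (state[0] - state[1])]      # two branches
    return "constant" if abs(state[0]) > 0.5 else "balanced"

print(deutsch(lambda x: 1))   # constant: f(0) == f(1)
print(deutsch(lambda x: x))   # balanced: f(0) != f(1)
```

The single query probes both inputs in superposition, and the final interference routes the answer to the measurement outcome; the full Deutsch-Jozsa algorithm plays the same trick on n-bit inputs, where the classical query gap becomes exponential.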
Other valuable applications soon emerged, however. Artur Ekert, then a Ph.D. student at Oxford, proposed E91, a new quantum cryptography scheme. Many physicists were drawn to the technique by its elegance and practicality, as well as by the fact that it was published in a prominent physics journal. “It’s a lovely idea,” Gisin adds. “It’s a little odd that Ekert isn’t on the list” of this year’s fundamental physics Breakthrough Prize recipients.
When Bennett, Brassard, Jozsa, computer scientist Claude Crépeau and physicists Asher Peres and William Wootters proposed quantum teleportation two years later, scientists took notice. The method allows one party to send quantum information, such as the outcome of a coin flip, to another via entanglement, a quantum correlation that can link objects such as electrons. Despite science-fiction lore, the technique does not enable faster-than-light messaging, but it has significantly expanded the possibilities of real-world quantum communication. “That’s the most mind-boggling idea,” says Chao-Yang Lu, a quantum physicist at the University of Science and Technology of China who has helped implement the technique from space.
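The protocol can be replayed on a classical computer with a tiny state-vector simulation. The sketch below (our own illustrative Python, using conventional gate names) teleports an arbitrary qubit a|0> + b|1> from qubit 0 to qubit 2: the sender measures two qubits and transmits the two classical outcome bits, and the receiver applies at most two corrective gates.

```python
import math
import random

def _mask(q):
    return 1 << (2 - q)          # qubit 0 is the leftmost bit of |q0 q1 q2>

def apply_h(s, q):               # Hadamard gate on qubit q
    m, r = _mask(q), 1 / math.sqrt(2)
    for i in range(8):
        if not i & m:
            a, b = s[i], s[i | m]
            s[i], s[i | m] = r * (a + b), r * (a - b)

def apply_x(s, q):               # bit flip on qubit q
    m = _mask(q)
    for i in range(8):
        if not i & m:
            s[i], s[i | m] = s[i | m], s[i]

def apply_z(s, q):               # phase flip on qubit q
    m = _mask(q)
    for i in range(8):
        if i & m:
            s[i] = -s[i]

def apply_cnot(s, c, t):         # controlled-NOT: flip t when c is 1
    mc, mt = _mask(c), _mask(t)
    for i in range(8):
        if i & mc and not i & mt:
            s[i], s[i | mt] = s[i | mt], s[i]

def measure(s, q, rng):          # projective measurement; collapses the state
    m = _mask(q)
    p1 = sum(abs(s[i]) ** 2 for i in range(8) if i & m)
    out = 1 if rng.random() < p1 else 0
    norm = math.sqrt(p1 if out else 1 - p1)
    for i in range(8):
        s[i] = s[i] / norm if bool(i & m) == bool(out) else 0
    return out

def teleport(a, b, seed=0):
    """Teleport the qubit a|0> + b|1> from qubit 0 to qubit 2."""
    rng = random.Random(seed)
    s = [0j] * 8
    s[0], s[_mask(0)] = a, b             # qubit 0 = (a, b); qubits 1, 2 = |00>
    apply_h(s, 1); apply_cnot(s, 1, 2)   # entangle qubits 1 and 2 (Bell pair)
    apply_cnot(s, 0, 1); apply_h(s, 0)   # sender's operations
    m0, m1 = measure(s, 0, rng), measure(s, 1, rng)
    if m1: apply_x(s, 2)                 # receiver's corrections, chosen by the
    if m0: apply_z(s, 2)                 # two classical bits the sender transmits
    base = m0 << 2 | m1 << 1
    return s[base], s[base | 1]          # receiver's qubit is now (a, b)
```

Note that the two classical measurement bits must travel from sender to receiver before the corrections can be applied, which is exactly why teleportation cannot be used to signal faster than light.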
Words such as “revolution” are overused in describing scientific progress, which is usually slow and incremental. In 1994, however, Shor quietly started one. He had heard Vazirani and Bennett speak while working at AT&T Bell Laboratories. “I started thinking about what useful things a quantum computer could do,” he explains. “I figured it was a long shot. But it was a fascinating subject. So I began working on it. I didn’t tell many people.”
Inspired by the success of earlier quantum algorithms on periodic, or repeating, tasks, Shor invented an algorithm that could break numbers into their prime factors (for example, 21 = 7 x 3) exponentially faster than any known classical algorithm. The implications were obvious: the difficulty of prime factorization underpins much of modern cryptography. At last, quantum computers had a truly transformative practical application. According to Vazirani, Shor’s method “simply made it extremely plain that you needed to drop everything” to work on quantum computing.
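All of the quantum speedup in Shor’s algorithm lives in one subroutine: finding the period r of the sequence a, a², a³, … modulo N. The surrounding reduction from period finding to factoring is classical number theory, sketched below in Python with the period found by brute force, which is precisely the step a quantum computer would accelerate exponentially.

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r % n == 1 (brute force; finding r
    efficiently is the quantum computer's job in Shor's algorithm)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical sketch of Shor's reduction: period finding -> factoring.
    Assumes a is coprime to n; returns None on an 'unlucky' choice of a
    (r odd, or a**(r//2) == -1 mod n), in which case one retries."""
    r = order(a, n)
    if r % 2:
        return None
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None
    p, q = gcd(x - 1, n), gcd(x + 1, n)
    return (p, q) if p * q == n else None

print(shor_classical(21, 2))   # (7, 3): the article's example, 21 = 7 x 3
```

Here the order of 2 modulo 21 is 6, so x = 2³ = 8, and gcd(7, 21) = 7 and gcd(9, 21) = 3 recover the factors; for a 500-digit N, the brute-force loop is the part that would run for billions of years classically.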
Although Shor had discovered a compelling use for a quantum computer, he had not addressed the harder problem of how to build one, even in theory. The delicate quantum states such devices would exploit to outperform classical computers also make them extremely error-prone, and the error-correction procedures of classical computing cannot be applied directly to quantum machines. Undaunted, at a quantum computing conference in Turin, Italy, in 1995, Shor bet other researchers that a quantum computer would factor a 500-digit number before a classical computer could. (Factoring a 500-digit number would likely take billions of years even on today’s classical supercomputers.) No one took the wager, and some suggested a third possibility: that the sun might burn out first.
Quantum computers are plagued by two kinds of errors: bit errors and phase errors, roughly analogous to flipping a compass needle from north to south or from east to west. Unfortunately, correcting bit errors worsens phase errors, and vice versa; a more precise north bearing comes at the cost of a less precise east or west bearing. Later in 1995, however, Shor discovered how to combine bit correction and phase correction, a sequence of operations akin to solving one side of a Rubik’s Cube without disturbing a side that is already complete. Shor’s factoring algorithm will remain of little use until quantum computers become far more powerful (the largest number the algorithm has yet factored is 21, so classical factoring remains dominant for now), but his error-correction scheme made quantum computing feasible, if not yet practical. “That’s when it all became real,” Brassard explains.
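The bit-error half of that combination descends from the simplest classical error-correcting code: three-fold repetition with majority vote. The Python sketch below shows only the classical logic; in the quantum version the two parity checks are measured without reading the encoded information itself, and Shor’s nine-qubit code layers a phase-flip analogue of the same idea on top.

```python
def encode(bit):
    """Three-fold repetition code, shown classically: repeat the bit."""
    return [bit, bit, bit]

def syndrome(block):
    # Two pairwise parity checks locate a single flipped position without
    # needing to know what the encoded bit actually is.
    return (block[0] ^ block[1], block[1] ^ block[2])

def correct(block):
    """Repair at most one bit flip, guided only by the syndrome."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(block))
    if flip is not None:
        block[flip] ^= 1
    return block

for pos in range(3):             # any single flip is repaired
    damaged = encode(1)
    damaged[pos] ^= 1
    assert correct(damaged) == [1, 1, 1]
```

The key design point carries over to the quantum case: the syndrome reveals where an error occurred but nothing about the protected information, which is what lets correction proceed without destroying the fragile quantum state.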
All of this effort resulted in fresh perspectives on quantum mechanics and computing. It motivated Deutsch to develop an even more fundamental notion of “constructors,” which he describes as “the set of all physical transformations.” Others are skeptical about further profound revelations arriving from the quantum realm. “Quantum mechanics is incredibly odd, and I don’t think there will ever be a simple method to grasp it,” Shor says. When asked if his work on quantum computing makes the nature of reality easier or more difficult to explain, he quips, “It certainly makes it more mysterious.”
What began as a hobby or an eccentric academic pursuit has now grown far beyond many of the field’s pioneers’ wildest dreams. “We never imagined it would ever be practical. It was just a lot of fun to come up with these weird concepts,” Brassard says. “At some point, we realized we were serious, but no one followed us. It was frustrating. It’s incredibly wonderful that it’s now being recognized.”