Frédéric Magniez is Visiting Professor at the Collège de France, where he holds the annual Computer Sciences and Digital Sciences Chair for 2020-2021.
The quantum computing ecosystem is emerging
Interview with Frédéric Magniez
In 2019, Google announced that it had achieved what American physicist John Preskill dubbed "quantum supremacy". With its Sycamore quantum processor, the American giant succeeded in solving a problem that no conventional computer could have solved in a reasonable amount of time. Although the problem itself had no practical application, this technological feat made a strong impression on the physics and computing communities. The prototype, however, is still far from powerful enough to be considered a true quantum computer. That hasn't stopped theoretical computer scientists from anticipating what might happen once such a machine arrives, by dreaming up new algorithmic ideas. In France, Frédéric Magniez is a leader in this research. A graduate of ENS Cachan with a degree in mathematics and a doctorate in theoretical computer science, he has headed the Institut de recherche en informatique fondamentale (IRIF) since 2018. His work focuses on the design and analysis of probabilistic algorithms for processing very large volumes of data, as well as on the development of quantum computing and, more specifically, of cryptography and its interactions with physics. He holds the annual Computer Sciences and Digital Sciences Chair at the Collège de France, dedicated this year to quantum algorithms.
Your lectures in this year's Chair of Computer Science at the Collège de France focus on quantum computing. Can you tell us more about this field of research?
Frédéric Magniez: When a new computer architecture emerges - a supercomputer, for example - a new computing model emerges with it, and the community of theoretical computer scientists, of which I am a member, tries to understand its implications. In particular, we ask whether these machines can be programmed, and whether algorithms can be developed to take advantage of the specific features of the new architecture. We also ask what it means in terms of complexity and resources. For example, can a previously difficult problem become easy? Can computation be accelerated? In the case of quantum computing, we are asking all these questions about a very special architecture: a quantum computer.
What exactly is a quantum computer?
It's a machine that relies on the laws of quantum mechanics to store and manipulate information. Unlike a conventional computer, which manipulates bits of information (0s or 1s), a quantum computer manipulates what are known as quantum bits, or qubits. These can be in the state 0 or 1, but also in a superposition of these states. For certain types of problems, this property of quantum mechanics can significantly accelerate computation. This is the case, for example, with factoring, the inverse operation of multiplication. While it's very simple to multiply two very large integers, factoring, which consists in decomposing a very large number into a product of its prime factors, is much harder. So hard, in fact, that our conventional computers are incapable of performing this calculation in any reasonable time. With a quantum computer, however, this calculation would be exponentially faster, and therefore theoretically possible.
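[To make the multiplication/factoring asymmetry concrete, here is a minimal Python sketch, an editorial addition rather than part of the interview; the starting values are arbitrary. It compares the cost of multiplying two primes with the cost of recovering them by trial division, the most naive classical method.]

```python
import time

def is_prime(m):
    """Naive primality test by trial division."""
    return m > 1 and all(m % d for d in range(2, int(m**0.5) + 1))

def next_prime(m):
    """Smallest prime greater than or equal to m."""
    while not is_prime(m):
        m += 1
    return m

def smallest_factor(n):
    """Find the smallest non-trivial factor of n by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n is prime

p, q = next_prime(1_000_003), next_prime(2_000_003)

t0 = time.perf_counter()
n = p * q                  # multiplying is essentially instantaneous
t1 = time.perf_counter()
f = smallest_factor(n)     # undoing it already takes about a million steps
t2 = time.perf_counter()

print(f"multiply: {t1 - t0:.6f} s")
print(f"factor  : {t2 - t1:.6f} s (found factor {f})")
# Each extra digit of n roughly triples the trial-division time here,
# while the multiplication stays essentially instantaneous.
```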
In theory?
Yes, because a universal quantum computer does not yet exist. Only prototypes have been developed, notably by Google and IBM. To give an order of magnitude, a classical computer is equipped with processors containing several billion bits (0s or 1s), whereas quantum prototypes manipulate fewer than a hundred qubits. Google's Sycamore prototype, for example, has just 53 qubits. What's more, these prototypes are prone to computational errors, which makes it impossible to run truly useful quantum algorithms.
What is a quantum algorithm?
For a classical computer, writing an algorithm amounts to building a logic circuit out of so-called logic gates [1]. These gates perform elementary operations: they receive one or more bits of information as input (0s and 1s) and, depending on their values, compute an output, 0 or 1. There is a whole variety of logic gates performing different kinds of elementary operations. Similarly, quantum algorithms are based on quantum circuits made up of quantum logic gates. The difference is that the output is not necessarily in the state 0 or 1, but can be in a superposition of these two states.
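[To make this concrete, here is a minimal sketch, an editorial addition rather than part of the interview, in which a qubit is represented as a 2-dimensional vector, a quantum gate as a matrix, and a circuit as the sequence of matrices applied to the state.]

```python
import numpy as np

# Computational basis states |0> and |1> as vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Two one-qubit gates: X is the quantum analogue of the classical NOT gate,
# H (Hadamard) creates an equal superposition of |0> and |1>.
X = np.array([[0.0, 1.0],
              [1.0, 0.0]])
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# A tiny quantum circuit is just the composition of its gates.
state = ket0
for gate in (X, H):        # apply X, then H
    state = gate @ state

print(state)               # amplitudes of |0> and |1> (real here for simplicity)
print(state**2)            # measurement probabilities: the squared amplitudes
```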
What does this mean?
Mathematically, a quantum superposition is a sum of vectors whose coefficients, called amplitudes, can be negative, which is impossible with probability vectors in the classical case. The resulting mathematics is therefore completely different, since amplitudes, unlike probabilities, can be negative. In a way, superposition curves the probability space, giving it a different metric. And this particular geometry means that some problems can be solved more quickly, simply because they are viewed in a different space.
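[A short continuation of the previous sketch, again an editorial addition, illustrates what negative amplitudes buy us: applying the Hadamard gate twice returns the state exactly to |0>, because the two computation paths leading to |1> carry amplitudes +1/2 and -1/2 and cancel, something no classical mixing of non-negative probabilities can achieve.]

```python
import numpy as np

H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

after_one = H @ ket0       # [0.707, 0.707]: equal superposition of |0> and |1>
after_two = H @ after_one  # [1.0, 0.0]: back to |0> exactly

print(after_one, after_one**2)   # amplitudes, then probabilities
print(after_two, after_two**2)

# The amplitude of |1> after two gates is (1/2) + (-1/2) = 0:
# the two paths leading to |1> interfere destructively.
# A classical random process with non-negative probabilities could never
# return to a deterministic outcome after two rounds of mixing.
```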
For example?
A good example is the Fourier transform, well known to physicists and engineers. Without going into detail, the Fourier transform is a mathematical tool that allows us to look at signals in a different way. It is used in numerous signal-processing applications, including audio compression, speech recognition and digital transmission. If you remember oscilloscopes from high school, they have a button that displays the Fourier transform of a signal. It may look automatic, but in reality a great deal of computation happens behind the scenes. In quantum computing, however, there is a logic gate - the Hadamard gate - which, when applied to several quantum bits, computes a Fourier transform almost instantaneously, simply because we are working in a geometric space where it is naturally present. As a result, the gain in computing the Fourier transform on a quantum computer is exponential compared with a classical computer.
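[For the curious reader, here is a small NumPy sketch, an editorial illustration rather than part of the interview, of the transform the Hadamard gate gives access to: applying one Hadamard gate to each of n qubits implements, in a single layer of n gates, the Fourier transform over the group (Z_2)^n, also known as the Walsh-Hadamard transform, on a space of dimension N = 2^n. The quantum Fourier transform used in Shor's algorithm is a close relative requiring on the order of n^2 gates.]

```python
import numpy as np

H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

n = 3                              # number of qubits
N = 2 ** n                         # dimension of the register's state space

# One layer of n Hadamard gates acts on the whole register as the
# tensor (Kronecker) product H (x) H (x) ... (x) H.
layer = np.array([[1.0]])
for _ in range(n):
    layer = np.kron(layer, H)

print(layer.shape)                 # (8, 8): an N x N transform built from n gates

# This matrix is exactly the Fourier transform over (Z_2)^n (the
# Walsh-Hadamard transform): its entries are (-1)^(x.y) / sqrt(N),
# where x.y is the bitwise dot product of the indices modulo 2.
walsh = np.array([[(-1) ** bin(x & y).count("1") for y in range(N)]
                  for x in range(N)]) / np.sqrt(N)
print(np.allclose(layer, walsh))   # True

# The quantum circuit uses only n = log2(N) gates, acting on the whole
# superposition at once, whereas writing out the same transform classically
# touches all N values (about N log N operations with the fast transform).
```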
The quantum computer doesn't yet exist. Yet you are working on quantum algorithms. How is this possible?
It's important to understand that research in theoretical computer science involves asking questions about problems we don't yet know how to solve, and anticipating those that might arise. In that sense, my research sits very far upstream. As far as quantum algorithms are concerned, the quantum computing model has been around since the 1980s (see box), so nothing prevents us from imagining new algorithms that could take advantage of a quantum computer, even if one doesn't exist yet. As early as 1995, Peter Shor had the genius to establish a link between the factoring problem I mentioned earlier and the quantum Fourier transform; this will be the subject of one of my lectures at the Collège de France. That idea gave rise to Shor's famous algorithm, which would break most of the encryption protocols we use today.
Which ones?
In cryptography, there are two families of encryption: secret-key encryption and public-key encryption. In the former, everyone possesses a secret code, exchanged at a prior meeting, that enables a message to be decrypted. In practice, it could be stored on a USB key or a hard disk, for example. This secret-key cryptography may be weakened by quantum algorithms, but it remains secure overall. It's the second family, public-key cryptography, that collapses with quantum algorithms. Indeed, the main encryption protocols, such as the RSA cipher used in digital communications, banking transactions and online authentication, rely on the difficulty of factoring very large numbers into their prime factors. As we have already mentioned, Shor's algorithm would enable a quantum computer to perform this operation.
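[To see concretely why factoring matters here, the following toy RSA example, an editorial sketch with deliberately tiny and insecure numbers chosen for illustration, shows that whoever can factor the public modulus immediately recovers the private key.]

```python
# Toy RSA with tiny primes -- illustrative only, never use such sizes.
# Requires Python 3.8+ for pow(e, -1, phi).
p, q = 61, 53                 # the secret primes
n = p * q                     # 3233: the public modulus
e = 17                        # the public exponent
phi = (p - 1) * (q - 1)       # 3120, computable only if you know p and q
d = pow(e, -1, phi)           # 2753: the private (decryption) exponent

message = 65
cipher = pow(message, e, n)   # encryption: anyone can do this with (n, e)
print(pow(cipher, d, n))      # 65 -- decryption requires d

# An attacker who can factor n = 3233 back into 61 * 53 recovers phi,
# hence d, hence every message. For the 2048-bit moduli used in practice,
# this factoring step is out of reach for classical computers -- but not,
# in principle, for Shor's algorithm on a large, error-corrected quantum computer.
```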
The stakes are therefore very real...
Yes, Shor's algorithm was the first spectacular application of quantum algorithms, because it would offer a potentially exponential saving of time in factoring. I say "potentially" because, fundamentally, security is never proven! Nothing says that one day someone won't discover a way of breaking RSA encryption by classical means. We assume that it's very difficult to do with a conventional computer, and we're right until proven otherwise. Nevertheless, the existence of a quantum computer would pose a threat to computer security. That's why, in 2016, the US National Institute of Standards and Technology (NIST) launched a program dedicated to so-called "post-quantum" cryptography.
What is it all about?
It's a program launched to improve current cryptographic protocols and discover new ones, so that they can withstand the potential onslaught of a quantum computer. In concrete terms, cryptographers need to know what the best possible quantum attack could look like over the next fifty years, so they can adapt their security systems accordingly. So they come to quantum algorithmists like me and ask how many qubits, how many operations and how much computation time would theoretically be needed to break this or that security protocol. In this way, we develop theoretical tools that help the cryptography community imagine and dimension classical security systems that would withstand the attacks of a quantum computer.
Cryptography isn't the only problem you're interested in...
Another aspect of my research is to review all existing algorithmic concepts and try to convert them to the quantum setting in order to estimate the gain - in other words, to estimate by what order of magnitude computation could be accelerated. It's a titanic task! It shows that the gain is not always exponential. For some problems, a quantum algorithm will do no better than its classical analogue. For others, the gain may be polynomial, for instance quadratic. In the latter case, a quantum computer solves in ten steps a problem for which a classical computer needs a hundred.
Can you give an example?
Take a telephone directory, with contacts listed in alphabetical order. If you're looking for the number of a person whose name you know, you'll quickly land on the right page. This is known as binary (dichotomic) search, and the search time grows only logarithmically with the size of the directory. Interestingly, for such an application a quantum computer would do no better than a classical one, so there is no added value. The reverse task, on the other hand, would be accelerated. If I have a number and need to find the corresponding name, I have to try entries more or less at random, which can be very time-consuming. With a quantum computer running Grover's algorithm [2], the saving would be quadratic: the machine would need about √n steps instead of n. This procedure is the keystone of many applications, particularly in combinatorial optimization.
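[As an editorial illustration of this quadratic gain, here is a small NumPy simulation of Grover's algorithm on the "number to name" task; the directory size N and the marked index are arbitrary choices. Roughly (π/4)√N Grover iterations concentrate almost all the probability on the sought entry, whereas a classical search would need on the order of N lookups.]

```python
import numpy as np

N = 1024                            # size of the "directory" (2^10 entries)
marked = 731                        # index of the entry we are looking for

# Start in the uniform superposition over all N entries.
state = np.full(N, 1 / np.sqrt(N))

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # about 25 for N = 1024
for _ in range(iterations):
    # Oracle: flip the sign of the amplitude of the marked entry.
    state[marked] *= -1
    # Diffusion: reflect every amplitude about the current mean.
    state = 2 * state.mean() - state

probabilities = state ** 2
print(iterations, probabilities[marked])
# About 25 iterations suffice to find the marked entry with probability > 0.99,
# versus roughly N/2 = 512 classical lookups on average.
```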
One of your approaches also involves starting from certain complex problems and devising algorithms to solve them. Could you describe this approach for us?
The way we go about this search for new algorithmic ideas is as follows: first, we identify a problem regarded as "difficult". We often simplify it to the extreme so that only the primary, fundamental difficulty remains. This leads to problems whose statements look simplistic at first sight, but which are not simple to solve. For example, I've been very interested in finding triangles in graphs. Typically, this means finding three friends on a social network such that everyone knows everyone else. On the face of it, it sounds simple. And yet we don't know whether the currently known algorithm is the best possible one. To do better, we need a new algorithmic idea, so we turn the problem on its head and try to build an intuition. It's a process of going back and forth between finding a better solution and proving that no better solution exists. If we can't find one, there must be a reason, which we then try to establish mathematically. And if we fail to provide that proof, the failure often points to an algorithmic opening in the problem that had not yet been exploited. This back-and-forth is very often accompanied by new ideas, methods and tools whose impact goes beyond the problem initially studied. Often, a tiny advance - a slightly faster algorithm, for example - will have required the development of a new algorithmic idea, which in turn benefits other applications. It's really the identification of these difficulties that enables the discipline to move forward.
Wouldn't all this work be in vain if building a quantum computer turned out to be impossible?
Many of our colleagues believe that demonstrating it is impossible to build a quantum computer as powerful as we would like would itself be a great scientific achievement: it would give us a more complete understanding of quantum physics. The computational model specific to quantum computers would then have to be modified, with adjustments such as a reduction of quantum memory [3]. For some time now, I've been interested in writing algorithms that require little quantum memory, and I've realized that if a quantum machine has too little memory, the computational speed-up disappears entirely. This is important to study, because future technologies may have a few thousand qubits, but not several billion, so they won't be able to use much memory. Other colleagues are working on related subjects and imagining what computers with partially quantum memories could look like: we would keep a conventional hard disk, but be able to read its contents in quantum superposition, using a Grover-type algorithm. It's another memory model, but one that could inspire physicists for future machines.
Major manufacturers such as Intel, Microsoft, Google, Atos and IBM have entered the field of quantum computing. How do you see their presence?
In computing, there has always been a long production chain between research and software development. This was not the case in quantum computing, where we had a handful of theoretical algorithmists on one side and, on the other, experimental physicists carrying out experiments on prototypes of a few dozen quantum bits. These industrial players have enabled us to move much faster than we expected, and have contributed to the creation of an ecosystem that now makes it possible to set up this production chain. This is important, because it's not people like me who are going to find the first concrete applications for machines of 100 or 1,000 quantum bits. On the other hand, some of my work will undoubtedly give other people in this ecosystem ideas for discovering these long-awaited applications.
What message would you like to pass on to students interested in quantum computing?
In terms of academic opportunities, many universities are opening positions all over the world. It's a discipline in which you can easily find a thesis grant and a postdoc. What's more, the skills developed in quantum computing are useful in many other fields: some of my students have gone into financial mathematics or to the GAFA companies (Google, Apple, Facebook and Amazon). It doesn't matter if you don't continue in quantum computing: the skills you acquire are useful in other disciplines, ranging from applied mathematics to computer science. It's also a very close-knit, tolerant, open, equal-opportunity and cooperative environment. As I explained earlier, an ecosystem is being created around quantum computing, so there will be job opportunities, and everyone can find their place in it.
Interview by Gautier Cariou
When Feynman imagined the quantum computer
In 1981, at the first "Physics and Computation" conference, held at the Massachusetts Institute of Technology (MIT), the American physics Nobel laureate Richard Feynman wondered whether quantum systems could be simulated by a classical computer. In his view, such simulations would require exponential computing time and resources. He therefore proposed using the properties of quantum physics to compute differently, in order to reduce this complexity, and thereby introduced the concept of the quantum computer. A few years later, in 1985, the British-Israeli physicist David Deutsch defined the notion of a quantum Turing machine. In the 1990s, Ethan Bernstein, Umesh Vazirani, Peter Shor and Lov Grover devised the first quantum algorithms, and the development of quantum algorithms has continued ever since.
Definitions
[1] A logic gate is an elementary computing component that transforms one piece of binary information into another. A NOT gate, for example, transforms an input bit into its inverse: 1 becomes 0, and 0 becomes 1. There is a variety of logic gates, which together enable computers to perform a wide range of tasks. These gates are built from electronic components called transistors; the processors in our computers contain billions of them. In today's quantum processors, quantum gates are based on other physical systems: trapped ions and atoms, or electrical circuits cooled to near absolute zero.
[2] Invented by Lov Grover in 1996, this search algorithm enables a quantum computer to find one or more elements in a database of n entries in about √n steps, instead of the n steps required by a classical approach.
[3] A quantum memory is one of the elements needed to build a quantum computer. It is a device that would enable quantum information to be stored.