Harry Buhrman on quantum algorithms and cryptography


Harry Buhrman is head of the research group ‘Algorithms and Complexity’ at the Centrum Wiskunde & Informatica, Amsterdam, which he joined in 1994. Since 2000 he has also held a joint appointment as full professor of computer science at the University of Amsterdam. Buhrman’s research focuses on quantum computing, algorithms, complexity theory, and computational biology. In 2003 he obtained a prestigious Vici award, and he has coordinated several national and international projects.

Buhrman is an editor of several international journals and a member of various advisory and scientific boards, such as the advisory board of the Institute for Quantum Computing (Waterloo, Canada).

Luke Muehlhauser: In theory, position-based quantum cryptography could ensure that certain tasks could only be performed at certain locations:

For example, a country could send a message in such a way that it can only be deciphered at the location of its embassy in another country.

In Buhrman et al. (2011), you and your co-authors showed that

if collaborating adversaries are allowed to pre-share an arbitrarily large entangled quantum state, then… position-based cryptography… is impossible… in the quantum setting.

On the positive side, we show that for adversaries that are restricted to not share any entangled quantum states, secure position-verification is achievable. Jointly, these results suggest the interesting question whether secure position-verification is possible in the case of a bounded amount of entanglement.

What is the current state of our knowledge on whether secure position-verification is possible given a bounded amount of entanglement?


Harry Buhrman: Classically, such schemes will always be insecure, even if one restricts the running time or other resources of the adversaries. Quantumly the situation is more complicated, although we and others have shown that with an exponential amount of entanglement, quantum schemes can also be broken. The quest is thus on for schemes that 1) can be easily implemented by the honest parties, and 2) require unreasonable amounts of entanglement for adversaries to break. We have seen some partial progress along these lines. For example, a paper by Beigi and Koenig shows that there are schemes where a linear amount of entanglement is necessary to break them. It turns out that deciding whether such schemes exist, and, if they do, proving that particular schemes are secure, is a very hard problem that we are currently working on. The approach we have taken would, if successful, imply results in classical complexity theory, and is thus very hard. On the other hand, it may still be possible that schemes that are secure in this sense simply do not exist.
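To make the classical impossibility concrete, here is a toy one-dimensional model of a timing-based position-verification round, together with the copying attack. Everything here (the positions, the response function f, the challenge values) is a made-up illustration, not the scheme from the paper; the point is only that classical information can be duplicated, so two colluding adversaries straddling the claimed position always meet the verifiers' deadlines.

```python
# Toy 1-D model of timing-based position verification, plus the classical
# copying attack. Units are normalized so the signal speed is 1.

def f(c1: int, c2: int) -> int:
    """Toy response the prover must compute from both challenges."""
    return c1 ^ c2

d = 10.0   # verifier V1 sits at position 0, verifier V2 at position d
x = 4.0    # position the prover claims to occupy
T = 100.0  # V1 and V2 time their challenges so both arrive at x at time T

# Honest prover: answers at time T; replies reach V1 at T + x and V2 at
# T + (d - x). Those are the deadlines the verifiers enforce.
deadline_v1, deadline_v2 = T + x, T + (d - x)

# Colluding classical adversaries straddling x. Classical bits can be copied,
# so each intercepts the nearby challenge, keeps a copy, and forwards one on.
a1, a2 = 3.0, 6.0  # adversary positions with a1 <= x <= a2
knows_both_a1 = max(T - x + a1, T + x - a1)  # c1 directly, c2 via A2's copy
knows_both_a2 = max(T + x - a2, T - x + a2)  # c2 directly, c1 via A1's copy

c1, c2 = 13, 37
response = f(c1, c2)  # either adversary can compute this once it knows both

# Each adversary answers the nearby verifier; both deadlines are met whenever
# a1 <= x <= a2, so the verifiers cannot tell the attack from an honest prover.
print("A1 on time for V1:", knows_both_a1 + a1 <= deadline_v1)        # True
print("A2 on time for V2:", knows_both_a2 + (d - a2) <= deadline_v2)  # True
```

This copying step is exactly what quantum mechanics forbids (no-cloning), which is why the quantum case is more subtle and entanglement enters the picture.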


Luke: Based on your knowledge of the current state of quantum computing and quantum algorithm design, what recommendations to governments or citizens might you make?

E.g. Ronald de Wolf recommended we switch to post-quantum cryptography for sensitive data, especially since “even though there’s no QC yet, the security services… are probably hoovering up RSA-encrypted communications that they store for the time being, waiting for the QC that will allow them to decrypt these messages later.”


Harry: Yes, I agree with Ronald completely. Switch as soon as possible to quantum-proof security (although we don’t know exactly what is quantum proof; see the next paragraph). Indeed, all of our electronic encrypted communication, such as e-mail and telephone conversations encrypted with RSA-like techniques, cannot yet be decrypted, but the NSA and other organisations collect these encrypted data and wait for the day when they can decipher them. For example, once a sufficiently large quantum computer is built, such data can be decrypted very easily. So if we want to keep our data secure for a reasonable amount of time, we should switch to other encryption techniques and standards. This applies more to governments, banks, and big companies than to ordinary users, but still I feel that a lot of people don’t realise this. The common, incorrect way of approaching this runs as follows: this quantum computer, if it is built at all, is still 15 years ahead of us, and when that time comes we will switch to more secure encryption. That reasoning is of course wrong, because data collected and stored today can still be decrypted then.

So what encryption should we use instead? The problem is that this requires much more research. However, a few encryption schemes have been proposed that appear to be quantum proof. I suggest that we switch to those in combination with good old RSA, that is, double encryption. This way the security remains at least as good as it is now, but most likely it is also quantum proof. One caveat of this approach is that encryption and decryption become (much) less efficient, since RSA-like techniques are conveniently fast. This is the price one has to pay to be secure well into the future. So one has to make a decision when encrypting: is it only necessary to be secure for the next year or two (when transmitting a credit card number, for example, that is mostly fine), or do we need security well into the future? In the latter case I advise switching as soon as possible.
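As an illustration of the layering idea only, and not of any specific post-quantum scheme, here is a minimal Python sketch in which two independent toy stream-cipher layers stand in for RSA and a candidate quantum-proof scheme. Recovering the message requires breaking both layers, so the composition stays secure as long as either scheme remains unbroken. This is not production cryptography.

```python
# Double encryption sketch: compose two independent cipher layers so the
# result is at least as strong as the stronger layer. A SHA-256 counter-mode
# stream stands in for both schemes; purely illustrative.
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Expand (key, nonce) into a pseudorandom keystream via SHA-256 in counter mode."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def layer_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """One encryption layer: XOR with a fresh keystream; the nonce is prepended."""
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def layer_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    nonce, body = ciphertext[:16], ciphertext[16:]
    return bytes(c ^ k for c, k in zip(body, keystream(key, nonce, len(body))))

# Two independent keys: one stands in for RSA, one for a candidate
# quantum-proof scheme. An attacker must break BOTH layers.
key_classical = os.urandom(32)
key_postquantum = os.urandom(32)

message = b"keep this secret well into the future"
ct = layer_encrypt(key_postquantum, layer_encrypt(key_classical, message))
pt = layer_decrypt(key_classical, layer_decrypt(key_postquantum, ct))
assert pt == message
```

Note the efficiency cost Buhrman mentions: every message is processed twice, and real post-quantum schemes typically have larger keys and ciphertexts than this toy suggests.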


Luke: Are there additional recommendations you can make to governments or citizens, based on your knowledge of the current state of quantum computing and quantum algorithm design?


Harry: The cryptographic implications are the ones that pertain immediately to society. What most people don’t realise is that there are many problems in science and technology, in fields like chemistry, biology, physics, and drug design, that are computationally very demanding and often impossible to carry out even on the most powerful supercomputers. Quantum computers will help to solve some of these more efficiently, notably when they concern the simulation of quantum-mechanical systems and their properties. So research in QC will probably have an impact there as well.
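A quick back-of-the-envelope sketch of why such simulations overwhelm classical machines: the state vector of n qubits holds 2^n complex amplitudes, so the memory needed just to store it grows exponentially.

```python
# Memory needed merely to store the state vector of n qubits,
# at 16 bytes per complex128 amplitude.
for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n:2d} qubits -> {amplitudes:>16,} amplitudes (~{gib:,.1f} GiB)")
# Around 50 qubits the state vector alone needs ~16 million GiB (16 PiB),
# beyond any supercomputer's memory; a quantum computer represents the same
# system directly in its hardware.
```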


Luke: From your perspective, what are the most significant factors blocking or slowing greater progress on the theoretical half (not the experimental half) of quantum computing research? In other words, what feasible but difficult changes to the field (or related fields) would most accelerate theoretical quantum computing research?


Harry: That is a difficult question to answer. If I knew what to change, I would do it immediately. One of the problems is that we would like to have more quantum algorithms that demonstrate the power of quantum computation. The community has developed several techniques, like quantum Fourier sampling and quantum random walks, and we would like to have more such techniques and ideas. Maybe more importantly, we also need to isolate computational problems or tasks that are amenable to quantum speedup, that is, problems where quantum computers outperform classical computers.
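For a concrete feel of the first technique mentioned above, here is a small numpy sketch of quantum Fourier sampling (a classical simulation of an 8-qubit register; the sizes and names are chosen for this example). Applying the quantum Fourier transform to a state with a hidden period r pushes all measurement probability onto multiples of N/r, which is the engine inside Shor's algorithm.

```python
import numpy as np

n_qubits = 8
N = 2 ** n_qubits

# QFT matrix: F[j, k] = omega**(j*k) / sqrt(N), with omega = exp(2*pi*i/N).
omega = np.exp(2j * np.pi / N)
j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
F = omega ** (j * k) / np.sqrt(N)

# A state with hidden period r: uniform superposition over {0, r, 2r, ...}.
r = 16
state = np.zeros(N, dtype=complex)
state[::r] = 1 / np.sqrt(N // r)

# "Measure" after the QFT: probability mass sits only on multiples of N/r.
probs = np.abs(F @ state) ** 2
peaks = np.nonzero(probs > 1e-9)[0]
print(peaks)  # [0 16 32 ... 240], i.e. multiples of N/r = 16
```

A quantum computer produces one such sample using only about n_qubits**2 gates, whereas this classical simulation manipulates a vector of length 2**n_qubits; that gap is the speedup.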

Another area that could improve is that of fault tolerance and error-correcting codes. A lot of progress has been made, but we still don’t know exactly how many hardware errors can be tolerated by algorithmic design. This value is called the fault-tolerance threshold, and it is important to know it when implementing qubits and running quantum algorithms and protocols on them. We will need to perform error correction and fault-tolerant computation to fight the errors that pop up due to imperfections and noise. But if the errors that occur are too large, such correction by computation is not possible. It will also be important to keep the overhead as small as possible.
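As a toy illustration of the threshold phenomenon, the Monte Carlo sketch below uses the classical 3-bit repetition code as a stand-in for a quantum bit-flip code (the numbers are not the actual quantum threshold): below a critical physical error rate, encoding plus majority voting lowers the logical error rate, while above it, encoding makes things worse.

```python
import random

def logical_error_rate(p: float, trials: int = 100_000) -> float:
    """Fraction of trials where majority vote over 3 noisy copies is wrong."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(3))  # independent bit flips
        if flips >= 2:  # majority vote fails when 2 or 3 copies are flipped
            errors += 1
    return errors / trials

for p in (0.01, 0.1, 0.3, 0.5):
    print(f"physical {p:.2f} -> logical {logical_error_rate(p):.4f}")
# Below the (toy) threshold of 1/2, logical < physical, since the exact
# logical error rate is 3*p**2*(1 - p) + p**3 < p for p < 1/2.
```

Real fault-tolerance thresholds are far smaller (depending on the code and noise model), and the "overhead" Buhrman mentions is the blow-up in physical qubits and gates that the encoding costs.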

A third problem is that of few-qubit applications. We are beginning to see the demonstration of few-qubit computers in the lab, but there are no good problems that could be used to demonstrate the usefulness of small quantum computers that operate on, say, 30–100 qubits.


Luke: In Buhrman et al. (1997), you and your co-authors listed 6 open problems in computational complexity theory. Have any of them been solved in the intervening time, perhaps at least with relativized results?


Harry: In that paper we listed 6 hypotheses that are all probably false. The goal was, and still is, to show that they are equivalent to P = NP. We also listed 6 open problems, of which two were solved the next year by D. Sivakumar (numbers 2 and 5 from that list). As far as I know, not much progress has been made on these problems, or on showing that all the hypotheses are equivalent to P = NP. I think this is an indication of how difficult the P versus NP question is. There currently is a new approach to tackling this problem (not the six hypotheses but the P vs. NP problem itself), called geometric complexity theory, which involves algebraic geometry and representation theory, both difficult topics in mathematics.

There actually is a connection between quantum information and these types of classical questions. The language of quantum information allows one to restate some of these problems, which in turn allows one to approach them from a different angle. In general this has been quite successful: for example, the result of Ronald de Wolf and co-authors, which uses quantum communication at its core, or the paper I have with Jop Briet, Troy Lee, and Thomas Vidick, which settles a problem in Banach space theory by solving a problem in quantum information. There is also a survey of other such results.

Perhaps we should revisit the 6-hypotheses paper with quantum techniques and see if we can make a little progress, though at the moment I don’t see a clear plan of attack.


Luke: Thanks, Harry!