Hello, and welcome to this week’s edition of the Future in Five Questions. Today I interviewed Jay Gambetta, the vice president of quantum computing at IBM Research. We discussed his vision for a hybrid future of “quantum-centric” supercomputers, the paramount importance of the number zero and how he thinks the federal government can play a bigger role in boosting the quantum research ecosystem. This conversation has been edited and condensed for clarity:

What’s one underrated big idea?

The difference between a quantum-centric supercomputer and a quantum computer. A lot of people think that when we say quantum-centric supercomputer, it's a synonym for a quantum computer. What we actually mean is an architecture that takes the best of what you get from classical and the best of quantum. My boss [IBM Research director Dario Gil] often describes it as “bits, qubits and neurons” working together, integrated tightly enough that they're not disjoint, but not in a single operating system; they're still heterogeneous.

What we're seeing from the algorithms is that when you allow the flexibility to optimize over the different modes, you're going to have a lot more possibilities. If you have an architecture that allows you to move data between the two with tight enough integration, you can start to explore algorithms that run some part of the subroutine on the quantum computer and the rest of the routine on the classical computer, or in the future, a GPU.
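To make that division of labor concrete, here is a minimal sketch in the pattern of a variational algorithm: a classical optimizer proposes circuit parameters, a quantum subroutine estimates a cost, and the loop iterates. It assumes Qiskit's local simulator primitives; the toy Hamiltonian and ansatz are illustrative choices of mine, not anything IBM has published, and the point is only the architecture Gambetta describes.

```python
# A minimal sketch of a hybrid quantum-classical loop (the VQE pattern):
# a classical optimizer proposes circuit parameters, a quantum subroutine
# estimates the energy of the trial state, and the optimizer iterates.
# Assumes Qiskit >= 1.0 with its local simulator primitives; the toy
# Hamiltonian and ansatz below are illustrative, not IBM's.
import numpy as np
from scipy.optimize import minimize
from qiskit.circuit.library import TwoLocal
from qiskit.primitives import StatevectorEstimator
from qiskit.quantum_info import SparsePauliOp

# A toy two-qubit Hamiltonian whose ground-state energy we approximate.
hamiltonian = SparsePauliOp.from_list([("ZZ", 1.0), ("XI", 0.5), ("IX", 0.5)])

# A parameterized circuit: this is the quantum part of the computation.
ansatz = TwoLocal(2, rotation_blocks="ry", entanglement_blocks="cx", reps=2)
estimator = StatevectorEstimator()

def energy(params: np.ndarray) -> float:
    # Quantum subroutine: estimate <H> for the current parameters.
    job = estimator.run([(ansatz, hamiltonian, params)])
    return float(job.result()[0].data.evs)

# Classical subroutine: a standard optimizer drives the parameter search.
initial = np.zeros(ansatz.num_parameters)
result = minimize(energy, initial, method="COBYLA")
print("Estimated ground-state energy:", result.fun)
```

On real hardware the estimator primitive would come from IBM's runtime service rather than a local simulator, but the split is the same: the quantum processor handles one subroutine, classical machinery handles the rest.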
What’s a technology you think is overhyped?

I would say quantum, but I would like to put some qualifiers on that. Any time a new technology is created, everyone either assumes the worst, or they abstract away the important math and the important things it does to say it's going to solve drug discovery, or it's going to solve all these other things. Quantum is overhyped for all these applications in the short term, but it's under-hyped in how it's fundamentally changing what it means to compute. The use cases of quantum are overhyped, but the implications of what quantum is are probably under-hyped.

What book most shaped your conception of the future?

“Zero: The Biography of a Dangerous Idea” by Charles Seife. Math is something that a lot of people just take for granted. This book goes through the history of the number zero, which at first was just a place value. But once you have a place value, imagine a binary computer: now if you want to count, you can count 00, 01, 10, 11 and so on. That place value allows you to count and do that tabulation really fast. The book then gives examples of how society was impacted by just what zero means. I think the Western world even banned zero when it first came out. But it became so obvious that you needed zero, because it was foundational for calculus and differential equations.

Then you go to the physics world, and what the author points out in this book is that the three kinds of fundamental physics all have both a zero and an infinity, making their equations predict weird stuff. The absolute zero of temperature is what stops you from cooling something down and making perpetual motion. The zero in quantum is what gives you the vacuum fluctuations, or Heisenberg's uncertainty principle. And the zero in relativity gives you black holes. I find it profound that something as simple as the number zero, which people take for granted, matters so much.

What could government be doing regarding technology that it isn’t?

The fundamental investment in quantum physics is always good. But as we build these quantum computers, we need to make sure that we lead in building large-scale computers. It would be great if the U.S. made sure that the leading facilities here had the best quantum computers to explore algorithms and keep pushing us forward. When we put the train tracks across the U.S., the government was always pushing the limits of it. Why can't we do something similar, making sure the best technology is getting built here? Their investments are great, but we want to bring along the computing side as well.

What surprised you most in the past year?

The error correction paper that we put out last year. [Note: IBM Research scientists working for Gambetta published a paper in the journal Nature last year demonstrating that their quantum computers could perform error correction in a novel way that made those computers more useful.] I was under the impression, like everyone else, that error correction in the surface code was going to be the best answer. The surface code requires many, many qubits. And I didn't think this code was possible: one having both a high threshold and low overhead, so not using many qubits inside a single code. I thought that if you wanted a higher-threshold code, so you could handle lots of errors, the surface code was the best answer.

There are three aspects to it. A high threshold is one, which means it can handle a lot of errors. Second, it doesn't require all-to-all connectivity; it's still a 2D plane with every qubit having two extra connectors. I call it just a bit beyond 2D connection. The third is that it's efficient. What enabled it is that the codes are very connected, in a way, so the states that are on it have a lot of connections, and so they have complicated states, but they still have low weights, so you don't require measuring out all the qubits. It wasn't a priori obvious that you could get all three.
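For a sense of why the efficiency point is striking, here is a back-of-the-envelope comparison of physical-qubit overhead. The numbers for IBM's code are the [[144, 12, 12]] bivariate bicycle (“gross”) code reported in the Nature paper referenced above; the surface-code count uses the standard rotated layout with one logical qubit per patch. This is my own illustrative arithmetic under those assumptions, not a calculation from the interview.

```python
# Rough physical-qubit budget for a fixed logical capacity.
# A single rotated surface-code patch at distance d uses d^2 data qubits
# plus d^2 - 1 measurement ancillas, and encodes one logical qubit.
def surface_code_qubits(d: int, logical_qubits: int) -> int:
    return logical_qubits * (2 * d**2 - 1)

# The [[144, 12, 12]] bivariate bicycle ("gross") code: 144 data qubits
# plus 144 check qubits encode 12 logical qubits at distance 12.
gross_physical = 144 + 144
gross_logical = 12

surface_physical = surface_code_qubits(d=12, logical_qubits=gross_logical)
print(f"Surface code, 12 logical qubits at d=12: {surface_physical} physical qubits")
print(f"Gross code,   12 logical qubits at d=12: {gross_physical} physical qubits")
print(f"Approximate overhead reduction: {surface_physical / gross_physical:.1f}x")
```

Under these assumptions the high-rate code needs roughly an order of magnitude fewer physical qubits for the same logical capacity, which is the "efficient" aspect Gambetta singles out.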