A selection of the most important recent news, articles, and papers about quantum computing.
News and Articles
A breakthrough on the edge: One step closer to topological quantum computing
(Wednesday, July 10, 2024) “Researchers at the University of Cologne have achieved a significant breakthrough in quantum materials, potentially setting the stage for advancements in topological superconductivity and robust quantum computing / publication in ‘Nature Physics’”
Partnership boosts UK access to most powerful quantum technologies – UKRI
(Thursday, July 11, 2024) “UK industry and researchers will gain unparalleled access to the world’s most powerful quantum computers.”
Bob Sutor, Vice President and Practice Lead, Emerging Technologies, The Futurum Group, will speak at IQT Quantum + AI in New York City October 29-30 – Inside Quantum Technology
(Friday, July 12, 2024) “Bob Sutor; Vice President and Practice Lead, Emerging Technologies, The Futurum Group will speak at IQT Quantum + AI in New York City October 29-30. Dr. Bob Sutor has been a technical leader and executive in the IT industry for over 40 years. He is a theoretical mathematician by training, with a Ph.D. from Princeton”
Technical Papers and Preprints
[2406.17653] Algorithmic Fault Tolerance for Fast Quantum Computing
(Tuesday, June 25, 2024) “Fast, reliable logical operations are essential for the realization of useful quantum computers, as they are required to implement practical quantum algorithms at large scale. By redundantly encoding logical qubits into many physical qubits and using syndrome measurements to detect and subsequently correct errors, one can achieve very low logical error rates. However, for most practical quantum error correcting (QEC) codes such as the surface code, it is generally believed that due to syndrome extraction errors, multiple extraction rounds — on the order of the code distance d — are required for fault-tolerant computation. Here, we show that contrary to this common belief, fault-tolerant logical operations can be performed with constant time overhead for a broad class of QEC codes, including the surface code with magic state inputs and feed-forward operations, to achieve “algorithmic fault tolerance”. Through the combination of transversal operations and novel strategies for correlated decoding, despite only having access to partial syndrome information, we prove that the deviation from the ideal measurement result distribution can be made exponentially small in the code distance. We supplement this proof with circuit-level simulations in a range of relevant settings, demonstrating the fault tolerance and competitive performance of our approach. Our work sheds new light on the theory of fault tolerance, potentially reducing the space-time cost of practical fault-tolerant quantum computation by orders of magnitude.”
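As a rough illustration of the encode-measure-correct loop the abstract describes (and not the paper’s construction, which targets far more general codes and transversal logic), here is the textbook three-qubit bit-flip repetition code sketched in Qiskit; the injected error and the register names are my own choices for the example.

```python
# Not the paper's method: just the standard 3-qubit bit-flip repetition code,
# to illustrate "encode redundantly, extract syndromes, correct".
from qiskit import QuantumCircuit, QuantumRegister, ClassicalRegister

data = QuantumRegister(3, "data")
anc = QuantumRegister(2, "syndrome")
cr = ClassicalRegister(2, "c")
qc = QuantumCircuit(data, anc, cr)

# Encode |0> redundantly as |000> across the three data qubits
qc.cx(data[0], data[1])
qc.cx(data[0], data[2])

# Inject a deliberate bit-flip error on the middle data qubit
qc.x(data[1])

# Syndrome extraction: parities of (0,1) and (1,2) onto the ancillas
qc.cx(data[0], anc[0])
qc.cx(data[1], anc[0])
qc.cx(data[1], anc[1])
qc.cx(data[2], anc[1])
qc.measure(anc, cr)   # both syndrome bits read 1, pointing at data qubit 1

print(qc.draw())
```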
[2407.02553] Large-scale quantum reservoir learning with an analog quantum computer
(Tuesday, July 02, 2024) “Quantum machine learning has gained considerable attention as quantum technology advances, presenting a promising approach for efficiently learning complex data patterns. Despite this promise, most contemporary quantum methods require significant resources for variational parameter optimization and face issues with vanishing gradients, leading to experiments that are either limited in scale or lack potential for quantum advantage. To address this, we develop a general-purpose, gradient-free, and scalable quantum reservoir learning algorithm that harnesses the quantum dynamics of neutral-atom analog quantum computers to process data. We experimentally implement the algorithm, achieving competitive performance across various categories of machine learning tasks, including binary and multi-class classification, as well as timeseries prediction. Effective and improving learning is observed with increasing system sizes of up to 108 qubits, demonstrating the largest quantum machine learning experiment to date. We further observe comparative quantum kernel advantage in learning tasks by constructing synthetic datasets based on the geometric differences between generated quantum and classical data kernels. Our findings demonstrate the potential of utilizing classically intractable quantum correlations for effective machine learning. We expect these results to stimulate further extensions to different quantum hardware and machine learning paradigms, including early fault-tolerant hardware and generative machine learning tasks.”
[2407.07202] Quantum Approximate Optimization: A Computational Intelligence Perspective
(Tuesday, July 09, 2024) “Quantum computing is an emerging field on the multidisciplinary interface between physics, engineering, and computer science with the potential to make a large impact on computational intelligence (CI). The aim of this paper is to introduce quantum approximate optimization methods to the CI community because of direct relevance to solving combinatorial problems. We introduce quantum computing and variational quantum algorithms (VQAs). VQAs are an effective method for the near-term implementation of quantum solutions on noisy intermediate-scale quantum (NISQ) devices with less reliable qubits and early-stage error correction. Then, we explain Farhi et al.’s quantum approximate optimization algorithm (Farhi’s QAOA, to prevent confusion). This VQA is generalized by Hadfield et al. to the quantum alternating operator ansatz (QAOA), which is a nature-inspired (particularly, adiabatic) quantum metaheuristic for approximately solving combinatorial optimization problems on gate-based quantum computers. We discuss connections of QAOA to relevant domains, such as computational learning theory and genetic algorithms, discussing current techniques and known results regarding hybrid quantum-classical intelligence systems. We present a schematic of how QAOA is constructed, and also discuss how CI techniques can be used to improve QAOA. We conclude with QAOA implementations for the well-known maximum cut, maximum bisection, and traveling salesperson problems, which can serve as templates for CI practitioners interested in using QAOA.”
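For readers who want to see what a Farhi-style QAOA circuit looks like concretely, here is a minimal hand-built sketch in Qiskit for Max-Cut on a triangle graph; the graph, the single layer of depth, and the angle values are arbitrary choices for illustration, and in practice a classical optimizer would tune the parameters.

```python
# Minimal depth-1 QAOA circuit for Max-Cut on a 3-node triangle graph,
# written with only core Qiskit circuit APIs. The angles are placeholders.
from qiskit import QuantumCircuit

edges = [(0, 1), (1, 2), (0, 2)]   # triangle graph
gamma, beta = 0.8, 0.4             # illustrative, unoptimized parameters

qc = QuantumCircuit(3)
for q in range(3):
    qc.h(q)                        # uniform superposition over all cuts

# Cost layer: exp(-i * gamma * Z_j Z_k) for every edge (j, k)
for j, k in edges:
    qc.rzz(2 * gamma, j, k)

# Mixer layer: exp(-i * beta * X_q) on every qubit
for q in range(3):
    qc.rx(2 * beta, q)

qc.measure_all()
print(qc.draw())
```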
[2407.07694] Scalable, high-fidelity all-electronic control of trapped-ion qubits
(Wednesday, July 10, 2024) “The central challenge of quantum computing is implementing high-fidelity quantum gates at scale. However, many existing approaches to qubit control suffer from a scale-performance trade-off, impeding progress towards the creation of useful devices. Here, we present a vision for an electronically controlled trapped-ion quantum computer that alleviates this bottleneck. Our architecture utilizes shared current-carrying traces and local tuning electrodes in a microfabricated chip to perform quantum gates with low noise and crosstalk regardless of device size. To verify our approach, we experimentally demonstrate low-noise site-selective single- and two-qubit gates in a seven-zone ion trap that can control up to 10 qubits. We implement electronic single-qubit gates with 99.99916(7) […]”
Call for papers: Education, Research, and Application of Quantum Computing – HICSS 2022
My IBM Quantum colleague Dr. Andrew Wack and I are hosting a minitrack at the Hawaii International Conference on System Sciences (HICSS) 2022.
The description of the minitrack is:
There is no question that quantum computing will spur breakthroughs in natural science, AI, and computational algorithms such as those used in finance. IBM, Google, Honeywell, and several startups are working hard to create the next generation of “supercomputers” based on universal quantum technology.
What exactly is quantum computing, how does it work, how do we teach it, how do we leverage it in education and research, and what will it take to achieve these quantum breakthroughs?
The purpose of this minitrack is to bring together educators and researchers who are working to bring quantum computing into the mainstream.
We are looking for reports that
- improve our understanding of how to integrate quantum computing into business, machine learning, computer science, and applied mathematics university curriculums,
- describe hands-on student experiences with the open-source Qiskit quantum software development kit, and
- extend computational techniques for business, finance, and economics from classical to quantum systems.
It is part of the Decision Analytics and Service Science track at HICSS.
Please consider submitting a report and sharing this Call for Papers with your colleagues.
Some practical things you can do to learn about quantum computing
People often ask me, “Where should I get started to learn about quantum computing?” Here are several steps you can take. I work for IBM, so the things I link to will often come from the IBM Quantum program. Also, I acknowledge that several of the links and videos toward the beginning involve me, but we’ll get through those quickly.
Watch some introductory videos
If you only watch one video, watch this one from WIRED with Talia Gershon:
This one with me is from early 2019 and discusses the IBM Q System One:
Finally, this video from CNBC with Professor Scott Aaronson of the University of Texas at Austin, Martin Reynolds of Gartner, and me brings things up to date as of January 2020. Note that I personally do not support many of the statements about “quantum supremacy” (a horrible label; supercomputers do have massive amounts of storage, and there was an off-by-15-million-percent math error):
Get a book
If you are really just getting started and want to systematically work through the required math at an easy and conversational pace, my book Dancing with Qubits should prepare you for more advanced material and give you a start to reading research papers. (Shameless self-plug.)
If you are a hard-core physics and/or computer science person, you want to have Quantum Computation and Quantum Information: 10th Anniversary Edition by Michael A. Nielsen and Isaac L. Chuang in your library. It’s a little old by now, but if you want to end up doing quantum computing research, you will likely have to become very familiar and comfortable with its contents. Other books to consider are Quantum Computing: A Gentle Introduction (good on algorithms, though “gentle” is subjective!) and Quantum Computing for Computer Scientists (a bit dated; make sure you get a copy of the errata).
Play a game
Hello Quantum is available for Apple iOS and Android and will teach you the basics of how quantum gates and circuits work.
Build and run circuits with a real quantum computer
Quantum simulators have their place for basic education, experimentation, and debugging. Note, though, that a quantum simulator is to real quantum computer hardware as a TV console flight simulator is to a real plane. If you want a job as a pilot, I would prefer you knew how to fly an actual airplane.
The easiest way to get started without writing code is with the IBM Quantum Composer within the IBM Quantum Experience.
The IBM Quantum Experience has over 200,000 registered users, so you’ll be joining a very large community of beginner, intermediate, and advanced users.
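If you later want to see what a circuit you clicked together in the Composer looks like as code, here is a minimal sketch of a two-qubit Bell-state circuit run on a local simulator. It assumes the qiskit and qiskit-aer packages are installed; on the hosted IBM Quantum platform you would pick a real backend instead, and the exact provider APIs have changed across versions, so treat this only as illustrative.

```python
# Build a Bell-state circuit and sample it on the local Aer simulator.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2)
qc.h(0)            # put qubit 0 into superposition
qc.cx(0, 1)        # entangle qubit 1 with qubit 0
qc.measure_all()

backend = AerSimulator()
job = backend.run(transpile(qc, backend), shots=1000)
print(job.result().get_counts())   # expect roughly half "00" and half "11"
```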
Learn Python
If you are going to write quantum computing code, learn Python. As I write this, the latest version is 3.8. You want Python 3, not Python 2.
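As a tiny warm-up of the kind of Python you will actually use, the sketch below checks the interpreter version and normalizes a pair of complex amplitudes by hand; the numbers are made up for illustration.

```python
# Quantum amplitudes are complex numbers, and Python 3 handles them natively.
import sys
import math

assert sys.version_info >= (3,), "Use Python 3, not Python 2"
print("Running Python", sys.version.split()[0])

amplitudes = [1 + 1j, 0.5 - 0.5j]          # an unnormalized 1-qubit state
norm = math.sqrt(sum(abs(a) ** 2 for a in amplitudes))
state = [a / norm for a in amplitudes]

print("Normalized amplitudes:", state)
print("Probabilities sum to:", sum(abs(a) ** 2 for a in state))
```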
Learn Jupyter Notebooks
This is the modern way of developing full documents with interactive code, executions, graphics, videos, and visualizations. It’s used within the IBM Quantum Experience, but also in many other computational and AI applications. You are mainly interested in how to use it through a browser, not in how to run and maintain the server behind it.
Website (introductory): Introduction to Jupyter Notebooks
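To give a feel for what a notebook cell looks like, here is a minimal example that draws an inline bar chart of some made-up measurement counts; it assumes matplotlib is installed, as it is in most hosted notebook environments.

```python
# A typical notebook cell: run a small computation and show a plot inline.
# (In a notebook you would usually also have "%matplotlib inline".)
import matplotlib.pyplot as plt

counts = {"00": 512, "11": 488}            # made-up Bell-state counts
plt.bar(list(counts.keys()), list(counts.values()))
plt.xlabel("Measured bitstring")
plt.ylabel("Counts")
plt.title("Example histogram in a Jupyter cell")
plt.show()
```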
Write quantum computing code in Qiskit
Qiskit is the leading open source platform for developing quantum computing code and applications. It’s available on GitHub under the Apache 2.0 license. It has had over 300,000 downloads, but I recommend you use it through your browser on the IBM Cloud. As with the Composer, it is available through the IBM Quantum Experience.
Whether you want to download Qiskit or use it online, the easiest way to get started is to watch the series of videos by Abe Asfaw.
From there, you can watch the other videos and also learn about the Qiskit Community.
At this point you are ready to work your way through the online, open-source textbook Learn Quantum Computing through Qiskit.
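Once you have Qiskit available (locally or in the hosted notebooks), a first program might skip sampling entirely and just inspect a state’s amplitudes; the rotation angle below is an arbitrary choice, used only to show unequal probabilities.

```python
# Inspect amplitudes directly with Statevector instead of sampling counts.
from math import pi
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(1)
qc.ry(pi / 3, 0)                    # rotate |0> partway toward |1>

state = Statevector.from_instruction(qc)
print(state)                        # the two complex amplitudes
print(state.probabilities_dict())   # {'0': 0.75, '1': 0.25}
```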
Dancing With Qubits, First Edition: Let me preface my remarks with …
Way back in 1992, Springer-Verlag published my first book Axiom: The Scientific Computation System, co-authored with the late Richard D. Jenks. Since then I’ve thought of writing other books, but work and life in general caused enough inertia that I never got around to it.
I first got involved with IBM’s quantum computing effort in early 2016. By 2018, I was again thinking of writing a book and this subject was an obvious candidate. How would I start? What would I say? What was my perspective on the topic given that there were already some excellent books?
To write a book, you have to start writing. This is obvious, but no less true and important. In the summer of 2018, I started writing what I thought would be the introduction to the book. My perspective was, and is, very much from the mathematical and computer science directions. To be clear, I am not a physicist. If I could produce a coherent introduction to what I thought the book would cover, I might convince myself that it would be worth the hundreds of hours it would take to complete the project.
When I recently announced that the book was available for pre-order, my industry colleague Jason Bloomberg asked:
“So where does it fall on the spectrum between ‘totally accurate yet completely impenetrable’ and ‘approachable by normal humans but a complete whitewash’?”
I responded:
“I bring you along … to give you the underlying science of quantum computing so you can then read the “totally accurate but formally impenetrable” texts.”
I decided that I would cover the basic math necessary to understand quantum computing, and then get into quantum bits (qubits), gates, circuits, and algorithms. Although readers with the necessary background (or perhaps a good memory of that background) can skip the mathematical fundamentals, I decided to take people through the algebra and geometry of complex numbers, linear algebra, and probability necessary to understand what qubits are and what you can do with them.
That early draft of the book’s introduction described roughly 15 chapters divided into three parts. The final book has 12 chapters and 2 parts. That introduction eventually became the Preface. Part III eventually became Chapter 1.
It’s much tighter than what I imagined it would be, but there is still material I could have covered. There’s a natural tendency to want to add more and more, but I kept asking myself “What is this book about? How deeply do I want to go? Am I getting off track? Will I ever finish?”.
As 2018 went on, I kept tweaking the introduction and I started talking to publishers. In November, I started writing what was then the first chapter. Although I started in Microsoft Word, which is overwhelmingly the format of choice for many publishers, I quickly switched to LaTeX. This produced a far more beautiful book, but also placed constraints on how I could publish the book.
With this as teaser, in future entries I’ll talk more about the writing process, choices I made, LaTeX packages I used and macros I wrote, deciding how to publish the book, and working with editors. Once the book is available, I’ll talk about the specific content and why I included what I did.
Next: Last minute tweaks to my quantum computing book cover
In December, 2019, Packt Publishing published my book Dancing with Qubits: How quantum computing works and how it can change the world. Through a series of blog entries, I talk about the writing and publishing process, and then about the content. |