Notable and Interesting Recent Quantum News, Articles, and Papers for Wednesday, July 17, 2024

A selection of the most important recent news, articles, and papers about quantum computing.



News, Articles, and Analyses

Infleqtion Leads the Way with First Quantum Computer Installation at NQCC — Infleqtion

(Tuesday, July 16, 2024) “We’re thrilled to announce the installation of our state-of-the-art neutral atom quantum computer at the National Quantum Computing Centre (NQCC). As the first company to deploy hardware under the NQCC’s quantum computing testbed programme, this milestone showcases our cutting-edge technology…”

Oxford company unveils ‘pivotal’ quantum computing chip – BBC News

(Tuesday, July 16, 2024) “Oxford Ionics claim to have created the first quantum chip of its kind that could be mass-produced.”

Pritzker announces federal partner for quantum computing campus

(Wednesday, July 17, 2024) “CHICAGO (WCIA) — Illinois’ proposal to create a new quantum computing campus has a new partner with a federal agency. Governor J.B. Pritzker announced the partnership of Defense Advanced Research Projects Agency, part of the U.S. Department of Defense, with Illinois’ quantum computing campus Tuesday. The partnership is named Quantum Proving Ground. “The future of […]”

Technical Papers and Preprints

[2107.02151] Continuous Variable Quantum Algorithms: an Introduction

Authors: Buck, Samantha; Coleman, Robin; Sargsyan, Hayk

(Monday, July 05, 2021) “Quantum computing is usually associated with discrete quantum states and physical quantities possessing discrete eigenvalue spectrum. However, quantum computing in general is any computation accomplished by the exploitation of quantum properties of physical quantities, discrete or otherwise. It has been shown that physical quantities with continuous eigenvalue spectrum can be used for quantum computing as well. Currently, continuous variable quantum computing is a rapidly developing field both theoretically and experimentally. In this pedagogical introduction we present the basic theoretical concepts behind it and the tools for algorithm development. The paper targets readers with discrete quantum computing background, who are new to continuous variable quantum computing.”

Towards quantum enhanced adversarial robustness in machine learning | Nature Machine Intelligence

(Thursday, May 25, 2023) “To fulfil the potential of quantum machine learning for practical applications in the near future, it needs to be robust against adversarial attacks. West and colleagues give an overview of recent developments in quantum adversarial machine learning, and outline key challenges and future research directions to advance the field.”

[2407.02467] Error mitigation with stabilized noise in superconducting quantum processors

Authors: Kim, Youngseok; Govia, Luke C. G.; Dane, Andrew; Berg, Ewout van den; Zajac, David M.; Mitchell, Bradley; Liu, Yinyu; Balakrishnan, Karthik; Keefe, George; Stabile, Adam; Pritchett, Emily; Stehlik, Jiri; Kandala, Abhinav

(Tuesday, July 02, 2024) “Pre-fault tolerant quantum computers have already demonstrated the ability to estimate observable values accurately, at a scale beyond brute-force classical computation. This has been enabled by error mitigation techniques that often rely on a representative model of the device noise. However, learning and maintaining these models is complicated by fluctuations in the noise over unpredictable time scales, for instance, arising from resonant interactions between superconducting qubits and defect two-level systems (TLS). Such interactions affect the stability and uniformity of device performance as a whole, but also affect the noise model accuracy, leading to incorrect observable estimation. Here, we experimentally demonstrate that tuning of the qubit-TLS interactions helps reduce noise instabilities and consequently enables more reliable error-mitigation performance. These experiments provide a controlled platform for studying the performance of error mitigation in the presence of quasi-static noise. We anticipate that the capabilities introduced here will be crucial for the exploration of quantum applications on solid-state processors at non-trivial scales.”

[2407.05178] A typology of quantum algorithms

Authors: Arnault, Pablo; Arrighi, Pablo; Herbert, Steven; Kasnetsi, Evi; Li, Tianyi

(Saturday, July 06, 2024) “We draw the current landscape of quantum algorithms, by classifying about 130 quantum algorithms, according to the fundamental mathematical problems they solve, their real-world applications, the main subroutines they employ, and several other relevant criteria. The primary objectives include revealing trends of algorithms, identifying promising fields for implementations in the NISQ era, and identifying the key algorithmic primitives that power quantum advantage.”

 

Notable and Interesting Recent AI News, Articles, and Papers for Monday, July 15, 2024

A selection of the most important recent news, articles, and papers about AI.



News, Articles, and Analyses

Developers get by with a little help from AI: Stack Overflow Knows code assistant pulse survey results – Stack Overflow

Gen AI and beyond: Where else to focus now | McKinsey

(Friday, July 12, 2024) “Yes, gen AI can be dazzling. But to deliver value, leaders will have to look beyond center stage.”

Designing for Education with Artificial Intelligence: An Essential Guide for Developers – Office of Educational Technology

“Informing product leads and their teams of innovators, designers, and developers as they work toward safety, security, and trust while creating AI products and services for use in education.”

IBM’s AI, Open-Source Granite Models & Sports Technology – The Futurum Group

Author: Steven Dickens

“Chief Technology Advisor Steven Dickens shares insights on how IBM uses AI to enhance sports, democratizing innovation through open-source.”

Technical Papers and Preprints

[2407.08488] Lynx: An Open Source Hallucination Evaluation Model

Authors: Ravi, Selvan Sunitha; Mielczarek, Bartosz; Kannappan, Anand; Kiela, Douwe; Qian, Rebecca

(Thursday, July 11, 2024) “Retrieval Augmented Generation (RAG) techniques aim to mitigate hallucinations in Large Language Models (LLMs). However, LLMs can still produce information that is unsupported or contradictory to the retrieved contexts. We introduce LYNX, a SOTA hallucination detection LLM that is capable of advanced reasoning on challenging real-world hallucination scenarios. To evaluate LYNX, we present HaluBench, a comprehensive hallucination evaluation benchmark, consisting of 15k samples sourced from various real-world domains. Our experiment results show that LYNX outperforms GPT-4o, Claude-3-Sonnet, and closed and open-source LLM-as-a-judge models on HaluBench. We release LYNX, HaluBench and our evaluation code for public access.”

[2407.08105] Federated Learning and AI Regulation in the European Union: Who is Responsible? — An Interdisciplinary Analysis

Authors: Woisetschläger, Herbert; Mertel, Simon; Krönke, Christoph; Mayer, Ruben; Jacobsen, Hans-Arno

(Thursday, July 11, 2024) “The European Union Artificial Intelligence Act mandates clear stakeholder responsibilities in developing and deploying machine learning applications to avoid substantial fines, prioritizing private and secure data processing with data remaining at its origin. Federated Learning (FL) enables the training of generative AI Models across data siloes, sharing only model parameters while improving data security. Since FL is a cooperative learning paradigm, clients and servers naturally share legal responsibility in the FL pipeline. Our work contributes to clarifying the roles of both parties, explains strategies for shifting responsibilities to the server operator, and points out open technical challenges that we must solve to improve FL’s practical applicability under the EU AI Act.”

 

Notable and Interesting Recent Quantum News, Articles, and Papers for Saturday, July 13, 2024

A selection of the most important recent news, articles, and papers about quantum computing.


News and Articles

A breakthrough on the edge: One step closer to topological quantum computing

(Wednesday, July 10, 2024) “Researchers at the University of Cologne have achieved a significant breakthrough in quantum materials, potentially setting the stage for advancements in topological superconductivity and robust quantum computing / publication in ‘Nature Physics’”

Partnership boosts UK access to most powerful quantum technologies – UKRI

(Thursday, July 11, 2024) “UK industry and researchers will gain unparalleled access to the world’s most powerful quantum computers.”

Bob Sutor, Vice President and Practice Lead, Emerging Technologies, The Futurum Group, will speak at IQT Quantum + AI in New York City October 29-30 – Inside Quantum Technology

(Friday, July 12, 2024) “Bob Sutor, Vice President and Practice Lead, Emerging Technologies, The Futurum Group, will speak at IQT Quantum + AI in New York City October 29-30. Dr. Bob Sutor has been a technical leader and executive in the IT industry for over 40 years. He is a theoretical mathematician by training, with a Ph.D. from Princeton…”

Technical Papers and Preprints

[2406.17653] Algorithmic Fault Tolerance for Fast Quantum Computing

(Tuesday, June 25, 2024) “Fast, reliable logical operations are essential for the realization of useful quantum computers, as they are required to implement practical quantum algorithms at large scale. By redundantly encoding logical qubits into many physical qubits and using syndrome measurements to detect and subsequently correct errors, one can achieve very low logical error rates. However, for most practical quantum error correcting (QEC) codes such as the surface code, it is generally believed that due to syndrome extraction errors, multiple extraction rounds — on the order of the code distance d — are required for fault-tolerant computation. Here, we show that contrary to this common belief, fault-tolerant logical operations can be performed with constant time overhead for a broad class of QEC codes, including the surface code with magic state inputs and feed-forward operations, to achieve “algorithmic fault tolerance”. Through the combination of transversal operations and novel strategies for correlated decoding, despite only having access to partial syndrome information, we prove that the deviation from the ideal measurement result distribution can be made exponentially small in the code distance. We supplement this proof with circuit-level simulations in a range of relevant settings, demonstrating the fault tolerance and competitive performance of our approach. Our work sheds new light on the theory of fault tolerance, potentially reducing the space-time cost of practical fault-tolerant quantum computation by orders of magnitude.”

[2407.02553] Large-scale quantum reservoir learning with an analog quantum computer

(Tuesday, July 02, 2024) “Quantum machine learning has gained considerable attention as quantum technology advances, presenting a promising approach for efficiently learning complex data patterns. Despite this promise, most contemporary quantum methods require significant resources for variational parameter optimization and face issues with vanishing gradients, leading to experiments that are either limited in scale or lack potential for quantum advantage. To address this, we develop a general-purpose, gradient-free, and scalable quantum reservoir learning algorithm that harnesses the quantum dynamics of neutral-atom analog quantum computers to process data. We experimentally implement the algorithm, achieving competitive performance across various categories of machine learning tasks, including binary and multi-class classification, as well as timeseries prediction. Effective and improving learning is observed with increasing system sizes of up to 108 qubits, demonstrating the largest quantum machine learning experiment to date. We further observe comparative quantum kernel advantage in learning tasks by constructing synthetic datasets based on the geometric differences between generated quantum and classical data kernels. Our findings demonstrate the potential of utilizing classically intractable quantum correlations for effective machine learning. We expect these results to stimulate further extensions to different quantum hardware and machine learning paradigms, including early fault-tolerant hardware and generative machine learning tasks.”

[2407.07202] Quantum Approximate Optimization: A Computational Intelligence Perspective

(Tuesday, July 09, 2024) “Quantum computing is an emerging field on the multidisciplinary interface between physics, engineering, and computer science with the potential to make a large impact on computational intelligence (CI). The aim of this paper is to introduce quantum approximate optimization methods to the CI community because of direct relevance to solving combinatorial problems. We introduce quantum computing and variational quantum algorithms (VQAs). VQAs are an effective method for the near-term implementation of quantum solutions on noisy intermediate-scale quantum (NISQ) devices with less reliable qubits and early-stage error correction. Then, we explain Farhi et al.’s quantum approximate optimization algorithm (Farhi’s QAOA, to prevent confusion). This VQA is generalized by Hadfield et al. to the quantum alternating operator ansatz (QAOA), which is a nature-inspired (particularly, adiabatic) quantum metaheuristic for approximately solving combinatorial optimization problems on gate-based quantum computers. We discuss connections of QAOA to relevant domains, such as computational learning theory and genetic algorithms, discussing current techniques and known results regarding hybrid quantum-classical intelligence systems. We present a schematic of how QAOA is constructed, and also discuss how CI techniques can be used to improve QAOA. We conclude with QAOA implementations for the well-known maximum cut, maximum bisection, and traveling salesperson problems, which can serve as templates for CI practitioners interested in using QAOA.”

[2407.07694] Scalable, high-fidelity all-electronic control of trapped-ion qubits

(Wednesday, July 10, 2024) “The central challenge of quantum computing is implementing high-fidelity quantum gates at scale. However, many existing approaches to qubit control suffer from a scale-performance trade-off, impeding progress towards the creation of useful devices. Here, we present a vision for an electronically controlled trapped-ion quantum computer that alleviates this bottleneck. Our architecture utilizes shared current-carrying traces and local tuning electrodes in a microfabricated chip to perform quantum gates with low noise and crosstalk regardless of device size. To verify our approach, we experimentally demonstrate low-noise site-selective single- and two-qubit gates in a seven-zone ion trap that can control up to 10 qubits. We implement electronic single-qubit gates with 99.99916(7)%…”

 

Data science in my book Dancing with Python

In my blog entry “Quantum computing in my book Dancing with Python,” I described what the book covers related to quantum computing. I also published the entry “Availability of my book Dancing with Python and its table of contents.”

Today, I want to specifically list what I discuss in the book in what I term “an extended definition of data science.” The core chapters are in Part III. Here are their titles, introductions, and chapter tables of contents:

III Advanced Features and Libraries

12 Searching and Changing Text

We represent much of the world’s information as text. Think of all the words in all the digital newspapers, e-books, PDF files, blogs, emails, texts, and social media services such as Twitter and Facebook. Given a block of text, how do we search it to see if some desired information is present? How can we change the text to add formatting or corrections or extract information?

Chapter 4, Stringing You Along, covered Python’s string functions and methods. This chapter begins with regular expressions and then proceeds to natural language processing (NLP) basics: how to go from a string of text to some of the meaning contained therein.

12.1 Core string search and replace methods
12.2 Regular expressions
12.3 Introduction to Natural Language Processing
12.4 Summary
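
As a taste of that material, here is a minimal sketch of the kind of search-and-replace work the chapter builds up to, using Python’s built-in re module. The sample text and pattern are my own illustration, not an excerpt from the book.

    import re

    text = "Contact us at info@example.com or support@example.org."

    # Find every email address with a simple (not fully general) pattern.
    pattern = r"[\w.+-]+@[\w-]+\.\w+"
    print(re.findall(pattern, text))
    # ['info@example.com', 'support@example.org']

    # Replace each address with a placeholder to redact the text.
    print(re.sub(pattern, "<email>", text))
    # Contact us at <email> or <email>.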

13 Creating Plots and Charts

Among mathematicians and computer scientists, it’s said that a picture is worth 2¹⁰ words. Okay, that’s a bad joke, but it’s one thing to manipulate and compute with data and quite another to create stunning visualizations that convey useful information.

While there are many ways of building images and charts, Matplotlib is the most widely used Python library for doing so. [MAT] Matplotlib is very flexible and can produce high-quality output for print or digital media. It also has great support for a wide variety of backends that give you powerful mouse-driven interactivity. Generally speaking, if you have a coding project and you need to visualize numeric information, see if Matplotlib already does what you want. This chapter covers the core functionality of this essential library.

13.1 Function plots
13.2 Bar charts
13.3 Histograms
13.4 Pie charts
13.5 Scatter plots
13.6 Moving to three dimensions
13.7 Summary
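
To illustrate the style of Section 13.1, here is a minimal function-plot sketch using Matplotlib and NumPy. It is representative of the chapter’s approach rather than code taken from the book.

    import numpy as np
    import matplotlib.pyplot as plt

    # Plot sine and cosine on one set of axes.
    x = np.linspace(0, 2 * np.pi, 200)
    fig, ax = plt.subplots()
    ax.plot(x, np.sin(x), label="sin(x)")
    ax.plot(x, np.cos(x), label="cos(x)")
    ax.set_xlabel("x")
    ax.set_ylabel("y")
    ax.legend()
    plt.show()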

14 Analyzing Data

While we can use fancy names like “data science,” “analytics,” and “artificial intelligence” to talk about working with data, sometimes you just want to read, write, and process files containing many rows and columns of information. People have been doing this interactively for years, typically using applications like Microsoft Excel® and online apps like Google Sheets™.

By “programmatically” manipulating data, I mean that we use Python functions and methods. This chapter uses the popular pandas library to create and manipulate these collections of rows and columns, called DataFrames. [PAN] [PCB] We will later introduce other methods in Chapter 15, Learning, Briefly. Before we discuss DataFrames, let’s review some core ideas from statistics.

14.1 Statistics
14.2 Cats and commas
14.3 pandas DataFrames
14.4 Data cleaning
14.5 Statistics with pandas
14.6 Converting categorical data
14.7 Cats by gender in each locality
14.8 Are all tortoiseshell cats female?
14.9 Cats in trees and circles
14.10 Summary
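
In the same spirit, here is a minimal pandas sketch. The tiny invented table below stands in for the much larger cat-census data the chapter actually analyzes.

    import pandas as pd

    # A tiny invented dataset standing in for the chapter's cat data.
    cats = pd.DataFrame({
        "name": ["Mia", "Oreo", "Tigger", "Luna"],
        "gender": ["F", "M", "M", "F"],
        "coat": ["tortoiseshell", "tuxedo", "tabby", "tortoiseshell"],
    })

    # Count coats and group by gender, in the spirit of Sections 14.1 and 14.5.
    print(cats["coat"].value_counts())
    print(cats.groupby("gender").size())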

15 Learning, Briefly

Machine learning is not new, but it and its sub-discipline, deep learning, are now being used extensively for many applications in artificial intelligence (AI). There are hundreds of academic and practical coding books about machine learning.

This final chapter introduces machine learning and neural networks primarily through the scikit-learn (sklearn) module. Consider this a jumping-off point where you can use the Python features you’ve learned in this book to go more deeply into these essential AI areas if they interest you.

15.1 What is machine learning?
15.2 Cats again
15.3 Feature scaling
15.4 Feature selection and reduction
15.5 Clustering
15.6 Classification
15.7 Linear regression
15.8 Concepts of neural networks
15.9 Quantum machine learning
15.10 Summary
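
As a minimal sketch of the classification workflow in Section 15.6, here is a scikit-learn example. It uses the library’s bundled iris dataset in place of the book’s cat data, so it is an illustration of the technique rather than the book’s own code.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # Split the bundled iris dataset into training and test sets.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Fit a k-nearest-neighbors classifier and report test accuracy.
    model = KNeighborsClassifier(n_neighbors=3)
    model.fit(X_train, y_train)
    print(f"Test accuracy: {model.score(X_test, y_test):.3f}")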

This book is an introduction, so my goal is to get you started on a broad range of topics. For example, here are the Python modules and packages discussed or used in each of the four chapters in Part III:

12 Searching and Changing Text: re, flashtext, spacy
13 Creating Plots and Charts: matplotlib, numpy, mpl_toolkits.mplot3d
14 Analyzing Data: pandas, numpy, matplotlib, squarify, matplotlib-venn
15 Learning, Briefly: sklearn, pandas, numpy

In the book, I mention several other packages in passing, such as pytorch, as pointers for further exploration. The list above does not include standard modules such as math, random, and sys.

Availability of my book Dancing with Python and its table of contents

Cover of book Dancing with Python by Robert S. Sutor
My new book Dancing with Python: Learn Python software development from scratch and get started with quantum computing is now available for purchase from Amazon and Packt Publishing.


Develop skills in Python by implementing exciting algorithms, including mathematical functions, classical searching, data analysis, plotting data, machine learning techniques, and quantum circuits.

Key Features

Learn Python basics to write elegant and efficient code

Create quantum circuits and algorithms using Qiskit and run them on quantum computing hardware and simulators (see the sketch after this list)

Delve into Python’s advanced features, including machine learning, analyzing data, and searching
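
As promised above, here is a minimal sketch of the kind of circuit construction the book teaches, using Qiskit’s QuantumCircuit API. It is my own illustration of the technique, not an excerpt from the book.

    from qiskit import QuantumCircuit

    # Build a two-qubit Bell-state circuit.
    qc = QuantumCircuit(2, 2)
    qc.h(0)                     # put qubit 0 into superposition
    qc.cx(0, 1)                 # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])  # measure both qubits
    print(qc.draw())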


Contributors

About the author
About the reviewer

Contents

List of Figures

Preface

Why did I write this book?
For whom did I write this book?
What does this book cover?
What conventions do I use in this book?
Get in touch

1 Doing the Things That Coders Do

1.1 Data
1.2 Expressions
1.3 Functions
1.4 Libraries
1.5 Collections
1.6 Conditional processing
1.7 Loops
1.8 Exceptions
1.9 Records
1.10 Objects and classes
1.11 Qubits
1.12 Circuits
1.13 Summary

I Getting to Know Python

2 Working with Expressions

2.1 Numbers
2.2 Strings
2.3 Lists
2.4 Variables and assignment
2.5 True and False
2.6 Arithmetic
2.7 String operations
2.8 List operations
2.9 Printing
2.10 Conditionals
2.11 Loops
2.12 Functions
2.13 Summary

3 Collecting Things Together

3.1 The big three
3.2 Lists
3.3 The joy of O(1)
3.4 Tuples
3.5 Comprehensions
3.6 What does “Pythonic” mean?
3.7 Nested comprehensions
3.8 Parallel traverse
3.9 Dictionaries
3.10 Sets
3.11 Summary

4 Stringing You Along

4.1 Single, double, and triple quotes
4.2 Testing for substrings
4.3 Accessing characters
4.4 Creating strings
4.5 Strings and iterations
4.6 Strings and slicing
4.7 String tests
4.8 Splitting and stripping
4.9 Summary

5 Computing and Calculating

5.1 Using Python modules
5.2 Integers
5.3 Floating-point numbers
5.4 Rational numbers
5.5 Complex numbers
5.6 Symbolic computation
5.7 Random numbers
5.8 Quantum randomness
5.9 Summary

6 Defining and Using Functions

6.1 The basic form
6.2 Parameters and arguments
6.3 Naming conventions
6.4 Return values
6.5 Keyword arguments
6.6 Default argument values
6.7 Formatting conventions
6.8 Nested functions
6.9 Variable scope
6.10 Functions are objects
6.11 Anonymous functions
6.12 Recursion
6.13 Summary

7 Organizing Objects into Classes

7.1 Objects
7.2 Classes, methods, and variables
7.3 Object representation
7.4 Magic methods
7.5 Attributes and properties
7.6 Naming conventions and encapsulation
7.7 Commenting Python code
7.8 Documenting Python code
7.9 Enumerations
7.10 More polynomial magic
7.11 Class variables
7.12 Class and static methods
7.13 Inheritance
7.14 Iterators
7.15 Generators
7.16 Objects in collections
7.17 Creating modules
7.18 Summary

8 Working with Files

8.1 Paths and the file system
8.2 Moving around the file system
8.3 Creating and removing directories
8.4 Lists of files and folders
8.5 Names and locations
8.6 Types of files
8.7 Reading and writing files
8.8 Saving and restoring data
8.9 Summary

II Algorithms and Circuits

9 Understanding Gates and Circuits

9.1 The software stack
9.2 Boolean operations and bit logic gates
9.3 Logic circuits
9.4 Simplifying bit expressions
9.5 Universality for bit gates
9.6 Quantum gates and operations
9.7 Quantum circuits
9.8 Universality for quantum gates
9.9 Summary

10 Optimizing and Testing Your Code

10.1 Testing your code
10.2 Timing how long your code takes to run
10.3 Optimizing your code
10.4 Looking for orphan code
10.5 Defining and using decorators
10.6 Summary

11 Searching for the Quantum Improvement

11.1 Classical searching
11.2 Quantum searching via Grover
11.3 Oracles
11.4 Inversion about the mean
11.5 Amplitude amplification
11.6 Searching over two qubits
11.7 Summary

III Advanced Features and Libraries

12 Searching and Changing Text

12.1 Core string search and replace methods
12.2 Regular expressions
12.3 Introduction to Natural Language Processing
12.4 Summary

13 Creating Plots and Charts

13.1 Function plots
13.2 Bar charts
13.3 Histograms
13.4 Pie charts
13.5 Scatter plots
13.6 Moving to three dimensions
13.7 Summary

14 Analyzing Data

14.1 Statistics
14.2 Cats and commas
14.3 pandas DataFrames
14.4 Data cleaning
14.5 Statistics with pandas
14.6 Converting categorical data
14.7 Cats by gender in each locality
14.8 Are all tortoiseshell cats female?
14.9 Cats in trees and circles
14.10 Summary

15 Learning, Briefly

15.1 What is machine learning?
15.2 Cats again
15.3 Feature scaling
15.4 Feature selection and reduction
15.5 Clustering
15.6 Classification
15.7 Linear regression
15.8 Concepts of neural networks
15.9 Quantum machine learning
15.10 Summary

Appendices

A Tools

A.1 The operating system command line
A.2 Installing Python
A.3 Installing Python modules and packages
A.4 Installing a virtual environment
A.5 Installing the Python packages used in this book
A.6 The Python interpreter
A.7 IDLE
A.8 Visual Studio Code
A.9 Jupyter notebooks
A.10 Installing and setting up Qiskit
A.11 The IBM Quantum Composer and Lab
A.12 Linting

B Staying Current

B.1 python.org
B.2 qiskit.org
B.3 Python expert sites
B.4 Asking questions and getting answers

C The Complete UniPoly Class

D The Complete Guitar Class Hierarchy

E Notices

E.1 Photos, images, and diagrams
E.2 Data
E.3 Trademarks
E.4 Python 3 license

F Production Notes

References

Other Books You May Enjoy

Index

Index Formatting Examples
Python function, method, and property index
Python class index
Python module and package index
General index

Call for papers: Education, Research, and Application of Quantum Computing – HICSS 2022

Education, Research, and Application of Quantum Computing

My IBM Quantum colleague Dr. Andrew Wack and I are hosting a minitrack at the Hawaii International Conference on System Sciences (HICSS) 2022.

The description of the minitrack is:

There is no question that quantum computing will be a technology that will spur breakthroughs in natural science, AI, and computational algorithms such as those used in finance. IBM, Google, Honeywell, and several startups are working hard to create the next generation of “supercomputers” based on universal quantum technology.

What exactly is quantum computing, how does it work, how do we teach it, how do we leverage it in education and research, and what will it take to achieve these quantum breakthroughs?

The purpose of this minitrack is to bring together educators and researchers who are working to bring quantum computing into the mainstream.

We are looking for reports that

  • improve our understanding of how to integrate quantum computing into business, machine learning, computer science, and applied mathematics university curriculums,
  • describe hands-on student experiences with the open-source Qiskit quantum software development kit, and
  • extend computational techniques for business, finance, and economics from classical to quantum systems.

It is part of the Decision Analytics and Service Science track at HICSS.

Please consider submitting a report and sharing this Call for Papers with your colleagues.

Dancing with Qubits: Quantum Computing and Finance update

In section 1.5 of my quantum computing book Dancing with Qubits, I discuss potential applications of the technology to financial services. An excellent survey article by my IBM Quantum colleagues, now on arXiv, updates and goes into much greater detail than what I covered.

“Quantum computing for Finance: state of the art and future prospects” by Daniel J. Egger, Claudio Gambella, Jakub Marecek, Scott McFaddin, Martin Mevissen, Rudy Raymond, Andrea Simonetto, Stefan Woerner, and Elena Yndurain has this abstract:

This paper outlines our point of view regarding the applicability, state of the art, and potential of quantum computing for problems in finance. We provide an introduction to quantum computing as well as a survey on problem classes in finance that are computationally challenging classically and for which quantum computing algorithms are promising. In the main part, we describe in detail quantum algorithms for specific applications arising in financial services, such as those involving simulation, optimization, and machine learning problems. In addition, we include demonstrations of quantum algorithms on IBM Quantum back-ends and discuss the potential benefits of quantum algorithms for problems in financial services. We conclude with a summary of technical challenges and future prospects.

I highly recommend it.

IEEE Quantum Week and IBM Quantum

This year’s IEEE Quantum Week is planned for October 12-16, 2020, in Denver, Colorado.

IEEE Quantum Week is a multidisciplinary quantum computing venue where attendees will have the unique opportunity to discuss challenges and opportunities with quantum researchers, scientists, engineers, entrepreneurs, developers, students, practitioners, educators, programmers, and newcomers.

The IBM Quantum team is well represented at the conference, with a keynote from Jerry Chow. We also have 7 tutorials and 2 workshops.

Our tutorials are on the following topics:

  • Quantum programming, an introduction
  • Quantum machine learning for data scientists
  • Quantum hardware control: a hands-on introduction
  • Quantum algorithms for optimization
  • Quantum algorithms for chemistry simulation
  • Assessing the quality of qubits and quantum computers
  • Serious Games for Quantum Computing

Our workshops are:

  • Control and design of superconducting qubits
  • Software for quantum applications, algorithms, and workflows

Registration is now open.
