Machine Learning Approaches for Debugging a Quantum Computer

Event: Conference

Location: Zoom

25 November 2021, 18.00 – 20.00 (Bucharest time)

Postdoctoral Fellow, Zukunftskolleg, University of Konstanz (Germany)

Join Zoom Meeting

Meeting ID: 828 2096 7469
Passcode: 815904

In the past decades, mounting evidence that quantum algorithms can solve specific tasks with efficiency beyond the capability of a state-of-the-art classical computer has attracted tremendous interest in the field. A turning point was Shor’s algorithm for prime factorization, a polynomial-time quantum algorithm solving a problem that is hard for classical computers. A fully functioning all-purpose quantum device would have an enormous impact on our lives, with applications in science, drug discovery, disaster preparedness, space exploration, and environmental sustainability, among many others. As a consequence, an increasing number of countries and companies are investing billions of dollars in a race to produce and commercialize the quantum computer. Various physical systems for quantum computation have already been developed, and hybrid quantum algorithms, which aim at solving optimization problems more efficiently, can run on existing noisy intermediate-scale quantum (NISQ) devices. However, a full-size general-purpose quantum computer is still out of reach.

One of the difficulties in developing such a device is that, as the size and complexity of the quantum computer grow, more sophisticated techniques for calibrating and evaluating its performance are required in order to develop fault-tolerant devices. Quantum state tomography (QST) is a prominent technique for the verification of a quantum computer, which allows for the reconstruction of a given quantum state from measurement data. Because it provides comprehensive information about a given quantum state, QST is known as the “gold standard” for the verification of a quantum device. However, its computational cost makes it infeasible for systems larger than a few qubits. Moreover, it can be time-consuming even for small systems of only one or two qubits, the building blocks of a quantum computer. Efficient QST would therefore be an important step toward making a general-purpose quantum device possible.
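As an illustration of the reconstruction step (a minimal sketch, not the speaker's method): for a single qubit, any density matrix can be expanded in the Pauli basis, so measuring the three Pauli expectation values suffices to rebuild the state by linear inversion.

```python
import numpy as np

# Pauli matrices. Any single-qubit density matrix can be written as
# rho = (I + <X> X + <Y> Y + <Z> Z) / 2, so estimating the three Pauli
# expectation values is enough to reconstruct rho (linear-inversion QST).
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct(ex, ey, ez):
    """Rebuild a single-qubit density matrix from Pauli expectation values."""
    return (I + ex * X + ey * Y + ez * Z) / 2

# Example: the |+> state has <X> = 1 and <Y> = <Z> = 0.
rho = reconstruct(1.0, 0.0, 0.0)
print(np.round(rho.real, 3))  # -> [[0.5 0.5], [0.5 0.5]]
```

The number of required measurement settings grows exponentially with the number of qubits, which is why full QST becomes infeasible beyond small systems.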
One aspect of the efficiency of the QST procedure depends on the choice of the measurement scheme, which determines the number of measurements one needs to perform the QST. Finding a measurement scheme that minimizes the number of required measurements can be formulated as an optimization problem. My work focuses on applying and developing various optimization and machine learning methods with the goal of finding measurement schemes that minimize the number of measurements needed. By using prior knowledge of the landscape of potential solutions, such as particular symmetries and invariances, one can improve the exploration of the search space and find optimal measurement schemes.
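A toy version of this optimization problem (a hypothetical sketch, not the speaker's algorithm): treat each candidate observable as a vector in the space of 2x2 Hermitian matrices. A scheme is informationally complete when its observables span that space (rank d^2 = 4 for one qubit), and a greedy search can prune redundant settings.

```python
import numpy as np

# Pauli matrices used as candidate observables in this toy example.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def scheme_rank(observables):
    """Rank of the measurement map: how many independent directions
    in state space the scheme can distinguish."""
    M = np.array([obs.flatten() for obs in observables])
    return np.linalg.matrix_rank(M)

# Candidate settings, including a redundant one (X+Z lies in span{X, Z}).
candidates = {"I": I, "X": X, "Y": Y, "Z": Z, "X+Z": X + Z}

# Greedy pruning: drop any observable whose removal keeps the scheme
# informationally complete (rank 4).
kept = dict(candidates)
for name in list(kept):
    trial = {k: v for k, v in kept.items() if k != name}
    if scheme_rank(list(trial.values())) == 4:
        kept = trial

print(sorted(kept))  # a minimal informationally complete set of 4 observables
```

For many qubits the search space is exponentially large, which is where the symmetry- and invariance-aware machine learning methods described above come in.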

This event is organized within the framework of the NetIAS lecture series (Network of European Institutes for Advanced Study) hosted by NEC.