- New theoretical research by Los Alamos National Laboratory, Freie Universität Berlin and collaborators in the United States, United Kingdom and Switzerland proves that machine learning on quantum computers requires far simpler data than previously believed.
- These findings pave the way for maximizing the usefulness of today’s noisy intermediate-scale quantum computers for simulating quantum systems and other tasks better than classical digital computers, while also offering promise for optimizing quantum sensors.
- The group has developed a theoretical basis for more efficient algorithms, particularly for quantum machine learning, to exploit the capabilities of these noisy machines as the industry works to improve the quality and size of quantum computers.
PRESS RELEASE — Newswise — LOS ALAMOS, NM, July 5, 2023 — New theoretical research proves that machine learning on quantum computers requires much simpler data than previously believed. These findings pave the way for maximizing the usefulness of today’s noisy intermediate-scale quantum computers for simulating quantum systems and other tasks better than classical digital computers, while also offering promise for optimizing quantum sensors.
“We show that surprisingly simple data in small amounts is sufficient to train a quantum neural network,” said Lukasz Cincio, a quantum theorist at Los Alamos National Laboratory. He is a co-author of the paper reporting the findings, published in the journal Nature Communications. “This work takes another step toward making quantum machine learning easier, more accessible, and more near-term.”
The new paper emerges from a collaboration among the Los Alamos team, lead author Matthias Caro of Freie Universität Berlin, and other researchers from the United States, United Kingdom and Switzerland. The group has developed a theoretical basis for more efficient algorithms, particularly for quantum machine learning, to exploit the capabilities of these noisy machines as the industry works to improve the quality and size of quantum computers.
The new research paper builds on previous work by Los Alamos National Laboratory and its collaborators showing that training a quantum neural network requires very little data. Taken together, these theoretical breakthroughs show that training with very few and very simple states offers a practical route to doing useful work on today’s limited quantum computers faster than is possible on conventional classical computers.
“While previous work considered the amount of training data in quantum machine learning, here we focus on the type of training data,” said Caro. “We proved that a small amount of training data is sufficient even if we limit ourselves to simple types of data.”
“In practical terms, that means you can train a neural network not only on images of cats, for example, but also on very simple images,” said Cincio. “For quantum simulations, that means you can train on simple quantum states.”
“Such states are easy to prepare, which makes the entire learning algorithm much easier to run on a near-term quantum computer,” said co-author Zoe Holmes, professor of physics at the École Polytechnique Fédérale de Lausanne and former Los Alamos postdoc.
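The idea of training on simple, easy-to-prepare states can be sketched in a few lines. The toy model below is purely illustrative and not the paper's construction: it learns a hidden single-qubit rotation from just two computational-basis training states, assuming a single-parameter model and a fidelity-based loss.

```python
import numpy as np

def ry(theta):
    # Single-qubit rotation about the Y axis
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# Hidden "target" dynamics the learner must recover
target = ry(0.8)

# Simple, easy-to-prepare training states: the computational basis states
train_states = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
train_labels = [target @ s for s in train_states]

def loss(theta):
    # Average infidelity between model outputs and target outputs
    model = ry(theta)
    return np.mean([1 - abs(np.vdot(y, model @ x)) ** 2
                    for x, y in zip(train_states, train_labels)])

# Train by finite-difference gradient descent
theta, lr, eps = 0.0, 2.0, 1e-5
for _ in range(200):
    g = (loss(theta + eps) - loss(theta - eps)) / (2 * eps)
    theta -= lr * g

print(round(theta, 3))  # converges to the hidden target angle 0.8
```

Even this tiny example mirrors the qualitative point: two basis states, the simplest possible inputs, fully determine the unknown rotation.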
Near-term applications for quantum computers
Noise in the form of interactions between quantum bits, or qubits, and their surrounding environment causes errors that limit the processing capabilities of today’s quantum computer technology. Despite the noise, quantum computers excel at certain tasks, such as simulating quantum systems in materials science and classifying quantum states with machine learning.
“If you classify quantum data, then there is a certain amount of noise that you can tolerate and still get the right answer,” said Cincio. “That’s why quantum machine learning can be a great near-term application.”
Quantum machine learning tolerates more noise than other types of algorithms because a task like classification, the heart of machine learning, doesn’t require 100% accuracy to produce useful results, said Andrew T. Sornborger, a co-author of the paper. Sornborger is the Algorithm and Quantum Simulation thrust leader at the Quantum Science Center. Headquartered at Oak Ridge National Laboratory, the center is a collaboration among national laboratories, including Los Alamos, universities and industry.
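Sornborger's point about noise tolerance can be sketched numerically. In the toy NumPy model below (entirely illustrative; the setup and noise level are assumptions, not from the paper), states are classified by the sign of an estimated expectation value, and moderate "hardware" noise on those estimates still leaves the great majority of labels correct.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: each state is labeled by the sign of a Z-expectation value
true_expectations = rng.uniform(-1, 1, size=1000)
labels = np.sign(true_expectations)

# Hardware noise shifts each estimated expectation value
noisy_estimates = true_expectations + rng.normal(0, 0.15, size=1000)

# Classification only needs the sign to be right, not the exact value
predictions = np.sign(noisy_estimates)
accuracy = np.mean(predictions == labels)
print(f"accuracy with noise: {accuracy:.2f}")  # stays well above 0.9
```

Only the samples whose true expectation value sits very close to the decision boundary can be flipped by the noise, which is why a classification task degrades gracefully rather than failing outright.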
The new paper shows that using simpler data allows less complex quantum circuits to prepare certain quantum states on a computer, such as quantum chemical simulations showing the evolution of molecular systems. Simple circuits are easy to implement, less susceptible to noise and thus more likely to complete their computations. The new Nature Communications paper demonstrates a method for constructing quantum machine learning algorithms using easy-to-prepare states.
Off-loading to classical computers
Complex quantum algorithms exceed the processing power of even the largest classical computers. However, the team also found that because their new approach simplifies algorithm development, the compilation of quantum algorithms can be off-loaded to classical computers. The compiled algorithm can then be run successfully on a quantum computer. This approach lets programmers reserve quantum computing resources for the tasks quantum computers can uniquely perform but which overwhelm classical computers, such as simulating quantum systems, while avoiding the noise that accumulates in long circuits and causes errors on quantum hardware.
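As a toy illustration of classical compilation (a hedged sketch, not the team's actual method), the NumPy snippet below verifies entirely on a classical machine that a short two-gate sequence reproduces a target gate up to a global phase; only the final, compiled circuit would then need to run on quantum hardware.

```python
import numpy as np

def rz(theta):
    # Rotation about the Z axis
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

def ry(theta):
    # Rotation about the Y axis
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# Target gate to compile: the Hadamard
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Candidate compiled sequence, found and verified classically
compiled = ry(np.pi / 2) @ rz(np.pi)

# Phase-invariant fidelity: 1 means equal up to a global phase
fidelity = abs(np.trace(H.conj().T @ compiled)) / 2
print(f"compilation fidelity: {fidelity:.6f}")  # prints 1.000000
```

The expensive search for a short, hardware-friendly gate sequence happens on the classical side; the quantum processor only ever executes the short result.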
The Laboratory’s research also has applications in the developing field of quantum sensing. Harnessing certain principles of quantum mechanics makes it possible to build very sensitive devices for measuring gravitational or magnetic fields, for example.
“Quantum sensing methods in the absence of noise are straightforward and well understood theoretically, but the situation becomes much more complicated when noise is considered,” said Sornborger. “Adding quantum machine learning to quantum sensing protocols allows you to apply this method when the encoding mechanism is unknown or when hardware noise is affecting the quantum probe.” The application of quantum machine learning is being investigated in a Department of Energy-sponsored project led by Lukasz Cincio and Marco Cerezo, also of Los Alamos.
Paper: “Out-of-distribution generalization for learning quantum dynamics.” Authors: Matthias C. Caro, Hsin-Yuan Huang, Nicholas Ezzell, Joe Gibbs, Andrew T. Sornborger, Lukasz Cincio, Patrick J. Coles and Zoe Holmes. Nature Communications. DOI: 10.1038/s41467-023-39381-w
Funding: Funding for Los Alamos National Laboratory research was provided by the Laboratory Directed Research and Development program at Los Alamos, the Beyond Moore’s Law project at Los Alamos, and the Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, under the Acceleration of Research in Quantum Computing program.
SOURCE: Los Alamos National Laboratory
Featured image: Using simpler data enables less complex quantum circuits to perform machine learning tasks on quantum computers. Simple circuits are easy to implement and less susceptible to noise, and thus can complete their computations. Credit: Los Alamos National Laboratory