Aug. 24, 2022 — Training a quantum neural network requires only a small amount of data, according to a new proof that upends previous assumptions stemming from classical computing's huge appetite for data in machine learning, a form of artificial intelligence. The theorem has several direct applications, including more efficient compiling for quantum computers and distinguishing phases of matter for materials discovery.
“Many people believe that quantum machine learning will require a lot of data. We have rigorously shown that for many relevant problems, this is not the case,” said Lukasz Cincio, a quantum theorist at Los Alamos National Laboratory and co-author of the paper containing the proof published in the journal Nature Communications. “This provides new hope for quantum machine learning. We’re closing the gap between what we have today and what’s needed for quantum advantage, when quantum computers outperform classical computers.”
“The need for large data sets could have been a roadblock to quantum AI, but our work removes this roadblock. While other issues for quantum AI could still exist, at least now we know that the size of the data set is not an issue,” said Patrick Coles, a quantum theorist at the Laboratory and co-author of the paper.