This paper investigates the potential for quantum advantage in machine learning tasks where training data is provided. The authors show that, given access to data, classical machine learning models can accurately predict the outcomes of certain problems that are hard to compute classically. Using rigorous prediction error bounds, they develop a methodology for assessing quantum advantage, showing that classical models can compete with quantum models even on problems of a quantum nature. They propose a projected quantum model that can achieve a quantum speed-up in the fault-tolerant regime, and near-term experiments on engineered datasets of up to 30 qubits demonstrate a significant prediction advantage of this model over classical models.
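The projected quantum model mentioned above works by projecting quantum feature states back to a classical representation (e.g., reduced density matrices of individual qubits) and defining a kernel on those projections. The sketch below is an illustrative toy version, not the authors' exact construction: it assumes a simple non-entangling two-qubit `Ry` feature map (`encode`) and uses the form k(x, y) = exp(-γ Σ_k ‖ρ_k(x) − ρ_k(y)‖²_F), with all function names and the circuit choice being assumptions for demonstration.

```python
import numpy as np

def encode(x):
    """Toy 2-qubit feature map: product of single-qubit Ry rotations.
    Illustrative only -- the paper uses richer, entangling circuits."""
    def ry(theta):
        c, s = np.cos(theta / 2), np.sin(theta / 2)
        return np.array([[c, -s], [s, c]])
    q0 = ry(x[0]) @ np.array([1.0, 0.0])
    q1 = ry(x[1]) @ np.array([1.0, 0.0])
    return np.kron(q0, q1)  # 4-dimensional state vector

def reduced_density_matrices(psi):
    """One-qubit reduced density matrices of a 2-qubit pure state."""
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    rho0 = np.trace(rho, axis1=1, axis2=3)  # trace out qubit 1
    rho1 = np.trace(rho, axis1=0, axis2=2)  # trace out qubit 0
    return [rho0, rho1]

def projected_kernel(x, y, gamma=1.0):
    """k(x, y) = exp(-gamma * sum_k ||rho_k(x) - rho_k(y)||_F^2)."""
    rx = reduced_density_matrices(encode(x))
    ry_ = reduced_density_matrices(encode(y))
    d2 = sum(np.linalg.norm(a - b, "fro") ** 2 for a, b in zip(rx, ry_))
    return np.exp(-gamma * d2)
```

The resulting kernel matrix can be fed to any classical kernel method (e.g., kernel ridge regression); on real hardware the reduced density matrices would be estimated from measurements rather than computed exactly.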
Publisher
Nature Communications
Published On
May 11, 2021
Authors
Hsin-Yuan Huang, Michael Broughton, Masoud Mohseni, Ryan Babbush, Sergio Boixo, Hartmut Neven, Jarrod R. McClean
Tags
quantum advantage
machine learning
classical models
prediction error
quantum speed-up
fault-tolerant regime
engineered datasets