Abstract
This work provides a comprehensive study of generalization performance in quantum machine learning (QML) with limited training data. The authors show that the generalization error scales at worst as √(T/N), where T is the number of trainable gates and N is the number of training data points, and improves to √(K/N) when only K ≪ T gates change significantly during optimization. These results imply significant speedups for applications such as unitary compiling. The study also demonstrates the effectiveness of QML for quantum state classification and highlights potential applications in quantum error correction and dynamical simulation.
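The scaling described above can be sketched numerically. This is a minimal illustration only: the function name is made up, constant prefactors and the logarithmic factors present in the paper's full bound are omitted, and the numbers below are hypothetical, chosen to show how restricting attention to K ≪ T effectively trained gates tightens the bound.

```python
import math

def generalization_bound(num_gates: int, num_samples: int) -> float:
    """Simplified upper bound on generalization error, scaling as sqrt(gates / samples).

    Illustrative only: real bounds carry constants and log factors.
    """
    return math.sqrt(num_gates / num_samples)

# Hypothetical circuit with T = 1000 trainable gates, N = 100 training points.
T, N = 1000, 100
full_bound = generalization_bound(T, N)      # scales as sqrt(T/N)

# If only K = 10 gates changed appreciably during optimization,
# the refined bound scales as sqrt(K/N) instead.
K = 10
refined_bound = generalization_bound(K, N)

print(f"sqrt(T/N) bound: {full_bound:.3f}")
print(f"sqrt(K/N) bound: {refined_bound:.3f}")
```

With these illustrative numbers, the refined √(K/N) bound is an order of magnitude smaller than the worst-case √(T/N) bound, which is the source of the claimed speedups: far fewer training points are needed to guarantee the same generalization error.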
Publisher
Nature Communications
Published On
Aug 22, 2022
Authors
Matthias C. Caro, Hsin-Yuan Huang, M. Cerezo, Kunal Sharma, Andrew Sornborger, Lukasz Cincio, Patrick J. Coles
Tags
Quantum Machine Learning
generalization performance
training data
optimization
quantum applications
unitary compiling
quantum state classification