Predicting trends in the quality of state-of-the-art neural networks without access to training or testing data

Computer Science


C. H. Martin, T. Peng, et al.

Discover groundbreaking insights from Charles H. Martin, Tongsu (Serena) Peng, and Michael W. Mahoney as they tackle the daunting challenge of evaluating pre-trained neural network models without any access to training data. Their research reveals that power law-based metrics significantly outperform traditional measures in distinguishing model quality and uncovering hidden issues.

Abstract
This paper addresses the challenge of evaluating the quality of pre-trained neural network models without access to training or testing data. The authors conduct a meta-analysis of hundreds of publicly available pre-trained models from computer vision and natural language processing, examining norm-based and power law-based metrics. They find that power law-based metrics are superior in discriminating well-trained from poorly trained models and in identifying model problems not detectable through training/test accuracies alone.
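The power law-based metrics described in the abstract can be sketched as follows. The core idea is to fit a power law to the tail of the eigenvalue spectrum of each layer's weight correlation matrix, yielding an exponent (alpha) that can be computed from the weights alone, with no training or testing data. The snippet below is an illustrative reconstruction under stated assumptions, not the authors' exact implementation; the function name `powerlaw_alpha` and the maximum-likelihood tail fit are choices made here for demonstration.

```python
import numpy as np

def powerlaw_alpha(W, tail_frac=0.5):
    """Estimate the power-law exponent (alpha) of the tail of the
    eigenvalue spectrum of W^T W -- a data-free layer-quality metric
    in the spirit of the paper. This is an illustrative sketch:
    the tail selection and estimator are simplifications.
    """
    N = max(W.shape)
    X = W.T @ W / N                 # empirical correlation matrix
    eigs = np.linalg.eigvalsh(X)    # eigenvalues, ascending, >= 0
    # Keep the top tail_frac of eigenvalues as the "tail" to fit
    n_tail = max(2, int(len(eigs) * tail_frac))
    tail = eigs[-n_tail:]
    lam_min = tail[0]
    # Continuous power-law MLE (Clauset-Shalizi-Newman form):
    #   alpha = 1 + n / sum(ln(lambda_i / lambda_min))
    s = np.log(tail / lam_min).sum()
    return 1.0 + n_tail / s if s > 0 else np.inf

# Example: a random (untrained-like) weight matrix
rng = np.random.default_rng(0)
W = rng.standard_normal((300, 100))
alpha = powerlaw_alpha(W)
```

The paper's finding is that well-trained layers tend to show heavier-tailed spectra (smaller alpha) than poorly trained or random ones, so aggregating alpha across layers gives a data-free quality signal.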
Publisher
Nature Communications
Published On
Jul 05, 2021
Authors
Charles H. Martin, Tongsu (Serena) Peng, Michael W. Mahoney
Tags
neural networks
model evaluation
pre-trained models
power law metrics
computer vision
natural language processing