Power-law scaling to assist with key challenges in artificial intelligence

Computer Science

Y. Meir, S. Sardi, et al.

This study shows that optimized test errors in deep learning decrease as a power law with database size, a result relevant to rapid decision-making. Conducted by a team of researchers at Bar-Ilan University, the work establishes a benchmark for assessing training complexity across machine learning tasks and algorithms.
Abstract
Power-law scaling, a central concept in critical phenomena, proves useful in deep learning, where optimized test errors on handwritten digit examples converge to zero as a power law with database size. For rapid decision making with one training epoch, where each example is presented to the trained network only once, the power-law exponent increased with the number of hidden layers. For the largest dataset, the obtained test error was estimated to be in the proximity of state-of-the-art algorithms trained for large epoch numbers. Power-law scaling addresses key challenges in current artificial intelligence applications and facilitates an a priori estimate of the dataset size needed to achieve a desired test accuracy. It establishes a benchmark for measuring training complexity and a quantitative hierarchy of machine learning tasks and algorithms.
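The a priori dataset-size estimation mentioned in the abstract can be sketched as a simple extrapolation: if the test error follows error(D) ≈ a·D^(−ρ), a linear fit in log-log space recovers the exponent ρ, and inverting the fit gives the dataset size needed for a target accuracy. The numbers below are illustrative placeholders, not the paper's measurements, and the fit itself is a minimal sketch of the general technique rather than the authors' exact procedure.

```python
import numpy as np

# Hypothetical test errors measured at several dataset sizes
# (illustrative numbers only, not data from the paper).
sizes = np.array([1_000, 2_000, 5_000, 10_000, 20_000])
errors = np.array([0.080, 0.055, 0.033, 0.023, 0.016])

# Fit error(D) ~ a * D**(-rho) via linear regression in log-log space.
slope, intercept = np.polyfit(np.log(sizes), np.log(errors), 1)
rho, a = -slope, np.exp(intercept)

# A priori estimate of the dataset size needed for a target test error:
# invert target = a * D**(-rho)  =>  D = (a / target)**(1 / rho).
target_error = 0.01
required_size = (a / target_error) ** (1.0 / rho)

print(f"power-law exponent rho = {rho:.2f}")
print(f"examples needed for {target_error:.0%} test error = {required_size:,.0f}")
```

The steeper the exponent ρ (for example, with more hidden layers, as the abstract reports), the fewer additional examples are needed to reach a given accuracy.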
Publisher
Scientific Reports
Published On
Nov 12, 2020
Authors
Yuval Meir, Shira Sardi, Shiri Hodassman, Karin Kisos, Itamar Ben-Noam, Amir Goldental, Ido Kanter
Tags
power-law scaling
deep learning
test errors
database size
training complexity
machine learning
decision-making