Keywords
Descent (aeronautics)
Measure (data warehouse)
Computer science
Gradient descent
Conjecture
Function (biology)
Stochastic gradient descent
Variety (cybernetics)
Deep learning
Algorithm
Artificial intelligence
Mathematics
Artificial neural network
Data mining
Combinatorics
Biology
Engineering
Aerospace engineering
Evolutionary biology
Authors
Venkatesan Guruswami, Gal Kaplun, Yamini Bansal, Tristan Yang, Boaz Barak, Ilya Sutskever
Source
Journal: Cornell University - arXiv
Date: 2020-04-30
Citations: 163
Abstract
We show that a variety of modern deep learning tasks exhibit a double-descent phenomenon where, as we increase model size, performance first gets worse and then gets better. Moreover, we show that double descent occurs not just as a function of model size, but also as a function of the number of training epochs. We unify the above phenomena by defining a new complexity measure we call the effective model complexity, and conjecture a generalized double descent with respect to this measure. Furthermore, our notion of model complexity allows us to identify certain regimes where increasing (even quadrupling) the number of train samples actually hurts test performance.
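To make the model-wise effect concrete, the following is a minimal, hypothetical Python sketch, not the paper's experiments: it uses random Fourier features with a minimum-norm least-squares fit on synthetic data, a setting in which double descent in test error typically appears as model width grows. The target function, noise level, feature scale, and width grid are all illustrative assumptions.

```python
# Hypothetical toy sketch (not from the paper) of model-wise double descent:
# random-feature ridgeless regression on synthetic 1-D data.
import numpy as np

rng = np.random.default_rng(0)

n_train, n_test = 40, 200
x_train = rng.uniform(-1, 1, size=(n_train, 1))
x_test = rng.uniform(-1, 1, size=(n_test, 1))
target = lambda x: np.sin(2 * np.pi * x)          # assumed ground-truth function
y_train = target(x_train).ravel() + 0.1 * rng.normal(size=n_train)
y_test = target(x_test).ravel()

def random_features(x, w, b):
    """Random Fourier features: cos(x @ w + b)."""
    return np.cos(x @ w + b)

for n_features in [5, 10, 20, 40, 80, 160, 640]:
    # Fresh random feature map of the given width.
    w = rng.normal(scale=6.0, size=(1, n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    phi_train = random_features(x_train, w, b)
    phi_test = random_features(x_test, w, b)
    # Minimum-norm least squares: interpolates once n_features >= n_train.
    coef, *_ = np.linalg.lstsq(phi_train, y_train, rcond=None)
    test_mse = np.mean((phi_test @ coef - y_test) ** 2)
    print(f"width={n_features:4d}  test MSE={test_mse:.3f}")
```

Under these assumptions the printed test MSE usually peaks near width ≈ n_train (the interpolation threshold) and falls again as the width grows past it, mirroring the model-size curve described in the abstract.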