Generative grammar
Generative model
Dynamics
Training
Artificial intelligence
Computer science
Deep learning
Machine learning
Econometrics
Psychology
Mathematics
Geography
Pedagogy
Meteorology
Authors
Namjoon Suh,Guang Cheng
Source
Journal: Annual Review of Statistics and Its Application
[Annual Reviews]
Date: 2024-11-21
Identifier
DOI:10.1146/annurev-statistics-040522-013920
Abstract
In this article, we review the literature on statistical theories of neural networks from three perspectives: approximation, training dynamics, and generative models. In the first part, results on excess risks for neural networks are reviewed in the nonparametric framework of regression. These results rely on explicit constructions of neural networks, leading to fast convergence rates of excess risks. Nonetheless, their underlying analysis only applies to the global minimizer in the highly nonconvex landscape of deep neural networks. This motivates us to review the training dynamics of neural networks in the second part. Specifically, we review articles that attempt to answer the question of how a neural network trained via gradient-based methods finds a solution that can generalize well on unseen data. In particular, two well-known paradigms are reviewed: the neural tangent kernel and mean-field paradigms. Last, we review the most recent theoretical advancements in generative models, including generative adversarial networks, diffusion models, and in-context learning in large language models from two of the same perspectives, approximation and training dynamics.
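The neural tangent kernel paradigm mentioned above studies gradient-based training through the kernel K(x, x') = ⟨∇θ f(x; θ), ∇θ f(x'; θ)⟩, which stays nearly constant during training when the network is very wide. As a minimal illustration (our own sketch, not code from the paper), the empirical NTK of a tiny one-hidden-layer ReLU network with scalar input can be computed directly from the parameter gradients; the network and its 1/√m scaling are assumptions chosen to match the standard NTK parameterization:

```python
# Empirical neural tangent kernel for f(x) = a · relu(w * x) / sqrt(m).
# Hypothetical toy setup: scalar input, hidden width m, random init.
import numpy as np

rng = np.random.default_rng(0)
m = 512                      # hidden width; the NTK limit takes m -> infinity
w = rng.normal(size=m)      # input-to-hidden weights
a = rng.normal(size=m)      # hidden-to-output weights

def param_grad(x):
    """Gradient of f(x) with respect to all parameters (w, a)."""
    pre = w * x
    act = np.maximum(pre, 0.0)          # relu(w * x)
    d_act = (pre > 0).astype(float)     # relu'(w * x)
    grad_w = a * d_act * x / np.sqrt(m)
    grad_a = act / np.sqrt(m)
    return np.concatenate([grad_w, grad_a])

def empirical_ntk(xs):
    """Gram matrix K[i, j] = <grad f(x_i), grad f(x_j)>."""
    J = np.stack([param_grad(x) for x in xs])   # (n, 2m) Jacobian
    return J @ J.T

xs = np.array([-1.0, 0.5, 2.0])
K = empirical_ntk(xs)
print(K.shape)   # (3, 3)
```

In the infinite-width limit this Gram matrix converges to a deterministic kernel, so gradient descent on the network behaves like kernel regression with K; the mean-field paradigm instead tracks the evolving distribution of neurons and allows the kernel to move during training.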