Author
Pankaj Mehta, David J. Schwab
Source
Journal: arXiv: Machine Learning
Date: 2014-10-14
Citations: 85
Abstract
Deep learning is a broad set of techniques that uses multiple layers of representation to automatically learn relevant features directly from structured data. Recently, such techniques have yielded record-breaking results on a diverse set of difficult machine learning tasks in computer vision, speech recognition, and natural language processing. Despite the enormous success of deep learning, relatively little is understood theoretically about why these techniques are so successful at feature learning and compression. Here, we show that deep learning is intimately related to one of the most important and successful techniques in theoretical physics, the renormalization group (RG). RG is an iterative coarse-graining scheme that allows for the extraction of relevant features (i.e. operators) as a physical system is examined at different length scales. We construct an exact mapping between the variational renormalization group, first introduced by Kadanoff, and deep learning architectures based on Restricted Boltzmann Machines (RBMs). We illustrate these ideas using the nearest-neighbor Ising model in one and two dimensions. Our results suggest that deep learning algorithms may be employing a generalized RG-like scheme to learn relevant features from data.
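To make the objects in the abstract concrete, here is a minimal, self-contained sketch (not the paper's actual construction): it draws samples from a 1D nearest-neighbor Ising chain with a short Metropolis run, then trains a small Restricted Boltzmann Machine on them with one-step contrastive divergence. The smaller hidden layer plays the role of a coarse-grained description of the spins. All function and class names, hyperparameters, and the CD-1 training choice are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_ising_1d(n_spins=8, n_samples=200, beta=1.0, sweeps=50):
    """Metropolis sampling of 1D periodic nearest-neighbor Ising chains
    (illustrative stand-in for 'structured data')."""
    spins = rng.choice([-1, 1], size=(n_samples, n_spins))
    for _ in range(sweeps):
        for i in range(n_spins):
            left = spins[:, (i - 1) % n_spins]
            right = spins[:, (i + 1) % n_spins]
            dE = 2 * spins[:, i] * (left + right)  # energy change if spin i flips
            flip = rng.random(n_samples) < np.exp(-beta * dE)
            spins[flip, i] *= -1
    return (spins + 1) // 2  # map {-1,+1} -> {0,1} for the binary RBM

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal binary RBM trained with one-step contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0):
        h0 = self.hidden_probs(v0)
        h_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h_sample)      # one-step reconstruction
        h1 = self.hidden_probs(v1)
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (h0 - h1).mean(axis=0)
        return np.mean((v0 - v1) ** 2)         # reconstruction error

data = sample_ising_1d()
rbm = RBM(n_visible=8, n_hidden=4)  # fewer hidden units: a coarse-grained layer
err_first = rbm.cd1_step(data)
for _ in range(200):
    err_last = rbm.cd1_step(data)
```

After training, the reconstruction error is well below its initial value, and each column of `rbm.W` tends to couple strongly to a contiguous group of neighboring spins, which is loosely the block-spin intuition behind the RG analogy discussed in the abstract.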