Keywords
Oracle (complexity), symplectic geometry, symplectic integrator, symplectic manifold, computer science, toolbox, heuristic, acceleration, mathematical optimization, algorithm, theoretical computer science, machine learning, artificial intelligence, mathematics, software engineering, programming language, geometry, physics, classical mechanics
Authors
Michael Betancourt, Michael I. Jordan, Ashia C. Wilson
Source
Journal: Cornell University - arXiv
Date: 2018-02-10
Citations: 46
Abstract
Accelerated gradient methods have had significant impact in machine learning -- in particular the theoretical side of machine learning -- due to their ability to achieve oracle lower bounds. But their heuristic construction has hindered their full integration into the practical machine-learning algorithmic toolbox, and has limited their scope. In this paper we build on recent work which casts acceleration as a phenomenon best explained in continuous time, and we augment that picture by providing a systematic methodology for converting continuous-time dynamics into discrete-time algorithms while retaining oracle rates. Our framework is based on ideas from Hamiltonian dynamical systems and symplectic integration. These ideas have had major impact in many areas in applied mathematics, but have not yet been seen to have a relationship with optimization.
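The abstract's core idea is to discretize a continuous-time Hamiltonian flow with a symplectic integrator and use the resulting discrete map as an optimization algorithm. The sketch below is a minimal illustration of that idea, not the paper's actual method: it applies a leapfrog (Störmer–Verlet) step, the standard symplectic integrator for H(q, p) = |p|²/2 + f(q), to an ill-conditioned quadratic objective, with an added per-step momentum damping factor (an assumption here, since the pure Hamiltonian flow conserves energy and would oscillate rather than converge).

```python
import numpy as np

A = np.diag([1.0, 10.0])       # toy ill-conditioned quadratic: f(q) = 0.5 * q^T A q

def grad_f(q):
    # Gradient of the quadratic objective above.
    return A @ q

q = np.array([1.0, 1.0])       # position = optimization variable
p = np.zeros(2)                # momentum
h, gamma = 0.1, 0.9            # step size and damping factor (illustrative choices)

for _ in range(200):
    # One leapfrog (Stormer-Verlet) step for H(q, p) = 0.5*|p|^2 + f(q):
    p = p - 0.5 * h * grad_f(q)    # half kick
    q = q + h * p                  # full drift
    p = p - 0.5 * h * grad_f(q)    # half kick
    # Damping makes the dynamics dissipative so the iterates settle at the
    # minimizer; this step is not symplectic and is added purely for optimization.
    p = gamma * p

print(np.linalg.norm(q))  # distance to the minimizer q* = 0; should be near zero
```

The kick-drift-kick splitting is what makes each undamped step symplectic (each substep is a shear in phase space, so the composition preserves phase-space volume exactly), which is the structural property the paper's framework aims to retain while achieving oracle rates.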