Keywords
Mathematics
Perspective (graphics)
Stochastic differential equation
Mathematical optimization
Regular polygon
Stochastic optimization
Convex optimization
Applied mathematics
Mathematical economics
Geometry
Authors
M. Rodrigo, Jalal Fadili, Hédy Attouch
Identifier
DOI: 10.1287/moor.2022.0162
Abstract
In this paper, we analyze the global and local behavior of gradient-like flows under stochastic errors, with the aim of solving convex optimization problems with noisy gradient input. We first study the unconstrained differentiable convex case, using a stochastic differential equation whose drift term is minus the gradient of the objective function and whose diffusion term is either bounded or square-integrable. In this context, under Lipschitz continuity of the gradient, our first main result shows almost sure convergence of the objective values and of the trajectory process toward a minimizer of the objective function. We also provide a comprehensive complexity analysis by establishing several new pointwise and ergodic convergence rates in expectation for the convex, strongly convex, and (local) Łojasiewicz cases. The last case involves a challenging local analysis that requires nontrivial arguments from measure theory. We then extend our study to the constrained case and, more generally, to nonsmooth problems. We show that several of our results extend naturally when the gradient of the objective function is replaced by a cocoercive monotone operator, which makes it possible to obtain similar convergence results for optimization problems with an additively “smooth + nonsmooth” convex structure. Finally, we consider another extension of our results to nonsmooth optimization based on the Moreau envelope. Funding: This work was supported by the Agence Nationale de la Recherche (ANR) [Grant ANR-20-CE92-0037-01].
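To make the model concrete, a flow of the kind described in the abstract, dX_t = −∇f(X_t) dt + σ(t) dW_t, can be simulated with a plain Euler–Maruyama discretization. The sketch below is illustrative only and is not taken from the paper: the toy objective f(x) = ½‖Ax − b‖², the diffusion schedule σ(t) = σ₀/(1 + t) (chosen so that ∫σ(t)² dt < ∞, i.e., the square-integrable regime), and all numerical parameters are assumptions.

```python
import numpy as np

# Euler–Maruyama simulation of the gradient-flow SDE
#   dX_t = -grad f(X_t) dt + sigma(t) dW_t
# on a toy convex quadratic f(x) = 0.5 * ||A x - b||^2.
# Illustrative sketch only; f, sigma, and step sizes are assumptions,
# not the paper's setting.

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

def grad_f(x):
    """Gradient of f(x) = 0.5 * ||A x - b||^2 (convex, Lipschitz gradient)."""
    return A.T @ (A @ x - b)

def sigma(t, sigma0=1.0):
    """Square-integrable diffusion coefficient: integral of sigma(t)^2 is finite."""
    return sigma0 / (1.0 + t)

dt = 1e-3          # time step of the discretization
T = 50.0           # simulation horizon
x = np.zeros(3)    # initial point X_0

for k in range(int(T / dt)):
    t = k * dt
    dW = rng.standard_normal(3) * np.sqrt(dt)      # Brownian increment
    x = x - grad_f(x) * dt + sigma(t) * dW

x_star, *_ = np.linalg.lstsq(A, b, rcond=None)     # minimizer of f
print("||X_T - x*|| =", np.linalg.norm(x - x_star))
```

Because the diffusion is square-integrable, the simulated trajectory should settle near the least-squares minimizer, consistent with the almost sure convergence the abstract reports for this regime; with a merely bounded σ(t), one would instead expect persistent fluctuation around it.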