Keywords
Mathematical optimization, monotonic function, convergence (economics), smoothness, stationary point, mathematics, augmented Lagrangian method, sequence (biology), focus (optics), applied mathematics, differentiable function, computer science, mathematical analysis, statistics, physics, biology, optics, economics, genetics, economic growth
Authors
Hengmin Zhang, Junbin Gao, Jianjun Qian, Jian Yang, Chunyan Xu, Bob Zhang
Source
Journal: IEEE Transactions on Circuits and Systems for Video Technology
[Institute of Electrical and Electronics Engineers]
Date: 2023-07-04
Volume/Issue: 34 (2): 828-838
Citations: 9
Identifier
DOI: 10.1109/tcsvt.2023.3291821
Abstract
In this work, we study differentiable relaxations of several linear regression problems whose original formulations are typically nonsmooth and contain a nonconvex term. Unfortunately, in most cases the standard alternating direction method of multipliers (ADMM) cannot guarantee global convergence for such problems. To address this issue, we smooth the convex term and apply a linearization technique before designing the iteration procedure, employing a nonconvex ADMM to optimize these challenging nonconvex-convex composite problems. In our theoretical analysis, we prove that the generated variable sequence is bounded and converges to a stationary point. Meanwhile, a potential function is derived from the augmented Lagrangian, and we further verify that the objective function is monotonically nonincreasing. Under the Kurdyka-Łojasiewicz (KŁ) property, global convergence is established step by step. Finally, experiments on face reconstruction, image classification, and subspace clustering tasks show the superiority of our algorithms over several state-of-the-art methods.
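The smoothing-plus-linearization scheme the abstract describes can be illustrated with a minimal numerical sketch. This is not the paper's exact formulation: as stand-ins, the smoothed convex term is a Huber-smoothed l1 loss and the nonconvex term is an l0 penalty (hard thresholding), with the splitting z = Ax - b and a linearized x-update; all names, parameter values, and penalty choices below are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of a linearized nonconvex ADMM for
#   min_x  h(Ax - b) + lam * ||x||_0,
# where h is a Huber-smoothed l1 loss (the "smoothed convex term") and the
# l0 penalty stands in for the nonconvex term. Not the paper's algorithm.

rng = np.random.default_rng(0)
m, n = 40, 60
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[:5] = 3.0 * rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(m)

lam, rho, delta = 0.01, 1.0, 0.1        # penalty weight, ADMM parameter, Huber width
Lip = rho * np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the linearized part

def prox_huber(v, delta, rho):
    """Elementwise prox of the Huber function h_delta, scaled by 1/rho."""
    small = np.abs(v) <= delta + 1.0 / rho
    z = v - np.sign(v) / rho
    z[small] = v[small] * (rho * delta) / (rho * delta + 1.0)
    return z

def hard_threshold(v, tau):
    """Prox of tau * ||.||_0: keep entries with |v_i| > sqrt(2 * tau)."""
    return v * (np.abs(v) > np.sqrt(2.0 * tau))

x, z, y = np.zeros(n), np.zeros(m), np.zeros(m)
res0 = np.linalg.norm(A @ x - b - z)    # initial primal residual
for _ in range(300):
    # x-step: linearize the quadratic coupling, then apply the nonconvex prox
    grad = rho * A.T @ (A @ x - b - z + y / rho)
    x = hard_threshold(x - grad / Lip, lam / Lip)
    # z-step: prox of the smoothed (Huber) loss
    z = prox_huber(A @ x - b + y / rho, delta, rho)
    # dual ascent on the multiplier
    y = y + rho * (A @ x - b - z)

res = np.linalg.norm(A @ x - b - z)
print(res0, res)
```

As the abstract indicates, convergence of such iterations hinges on the boundedness of the variable sequence and a decreasing potential function; the sketch only monitors the primal residual ||Ax - b - z|| as a rough proxy.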