Linear discriminant analysis
Convergence (economics)
Trace (psycholinguistics)
Iterative method
Curse of dimensionality
Monotonic function
Algorithm
Mathematical proof
Computer science
Mathematical optimization
Mathematics
Statistics
Philosophy
Linguistics
Mathematical analysis
Geometry
Economics
Economic growth
Authors
Qiaolin Ye, Jie Yang, Hao Zheng, Liyong Fu
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Pages: 1-4
Identifiers
DOI: 10.1109/tnnls.2024.3355422
Abstract
Linear discriminant analysis (LDA) may yield an inexact solution when it transforms a trace ratio problem into a corresponding ratio trace problem. Recently, optimal dimensionality LDA (ODLDA) and trace ratio LDA (TRLDA) have been developed to overcome this problem. A key contribution of the two methods is the design of efficient iterative algorithms for deriving an optimal solution. However, theoretical evidence for the convergence of these algorithms has not yet been provided, which leaves the theory of ODLDA and TRLDA incomplete. In this correspondence, we present rigorous theoretical insight into the convergence of the iterative algorithms. Specifically, we first demonstrate the existence of lower bounds for the objective functions in both ODLDA and TRLDA, and then prove that the objective functions are monotonically decreasing under the iterative frameworks. Based on these findings, we finally establish the convergence of the iterative algorithms.
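The convergence argument sketched in the abstract (a monotone objective sequence that is bounded must converge) can be illustrated with the classic iterative scheme for the trace ratio problem max tr(W^T Sb W) / tr(W^T Sw W). This is a minimal sketch of that generic iteration, not the authors' ODLDA/TRLDA algorithms: their objectives decrease toward a lower bound, whereas in this standard formulation the ratio increases monotonically and is bounded above, so the same reasoning applies with signs flipped. All function and variable names here are illustrative.

```python
import numpy as np

def trace_ratio_iteration(Sb, Sw, d, n_iter=100, tol=1e-10):
    """Iteratively maximize tr(W^T Sb W) / tr(W^T Sw W) over p x d orthonormal W.

    Classic alternating scheme: fix the current ratio lam, then take W as the
    top-d eigenvectors of (Sb - lam * Sw). Each step cannot decrease the ratio,
    and the ratio is bounded above, so the sequence converges.
    """
    p = Sb.shape[0]
    W = np.eye(p)[:, :d]                      # arbitrary orthonormal starting point
    lam = np.trace(W.T @ Sb @ W) / np.trace(W.T @ Sw @ W)
    history = [lam]
    for _ in range(n_iter):
        # Eigenvectors of the shifted matrix; eigh returns ascending eigenvalues.
        _, vecs = np.linalg.eigh(Sb - lam * Sw)
        W = vecs[:, -d:]                      # eigenvectors of the d largest eigenvalues
        new_lam = np.trace(W.T @ Sb @ W) / np.trace(W.T @ Sw @ W)
        history.append(new_lam)
        if abs(new_lam - lam) < tol:          # monotone + bounded -> this gap shrinks
            break
        lam = new_lam
    return W, lam, history
```

The monotonicity follows because W maximizes tr(W^T (Sb - lam*Sw) W), whose optimal value is at least the value 0 attained at the previous iterate; with Sw positive definite this forces the new ratio to be no smaller than lam. Recording `history` makes the monotone behavior directly observable.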