Keywords
Conjugate gradient method
Preconditioner
Conjugate residual method
Derivation of the conjugate gradient method
Krylov subspace
Block (matrix)
Algorithm
Applied mathematics
Eigenvector
Nonlinear conjugate gradient method
Convergence (iterative methods)
Mathematical optimization
Iterative method
Computer science
Gradient descent
Mathematics
DOI:10.1137/s1064827500366124
摘要
We describe new algorithms of the locally optimal block preconditioned conjugate gradient (LOBPCG) method for symmetric eigenvalue problems, based on a local optimization of a three-term recurrence, and suggest several other new methods. To be able to compare numerically different methods in the class, with different preconditioners, we propose a common system of model tests, using random preconditioners and initial guesses. As the "ideal" control algorithm, we advocate the standard preconditioned conjugate gradient method for finding an eigenvector as an element of the null-space of the corresponding homogeneous system of linear equations under the assumption that the eigenvalue is known. We recommend that every new preconditioned eigensolver be compared with this "ideal" algorithm on our model test problems in terms of the speed of convergence, costs of every iteration, and memory requirements. We provide such a comparison for our LOBPCG method. Numerical results establish that our algorithm is practically as efficient as the "ideal" algorithm when the same preconditioner is used in both methods. We also show numerically that the LOBPCG method provides approximations to the first eigenpairs of about the same quality as those by the much more expensive global optimization method on the same generalized block Krylov subspace. We propose a new version of block Davidson's method as a generalization of the LOBPCG method. Finally, direct numerical comparisons with the Jacobi–Davidson method show that our method is more robust and converges almost two times faster.
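To illustrate the local optimization of a three-term recurrence that the abstract describes, here is a minimal single-vector sketch (block size 1) of the LOBPCG idea: at each step, a Rayleigh–Ritz procedure is applied on the trial subspace spanned by the current iterate, the preconditioned residual, and an implicitly conjugate direction from the previous step. This is an illustrative sketch, not the paper's implementation; the function name `lopcg` and its arguments are assumptions, and a production code (e.g. `scipy.sparse.linalg.lobpcg`) handles blocks, deflation, and basis ill-conditioning more carefully.

```python
import numpy as np

def lopcg(A, x, T=None, tol=1e-8, maxit=200):
    """Single-vector locally optimal preconditioned CG (a sketch of
    LOBPCG with block size 1) for the smallest eigenpair of a
    symmetric matrix A.  T, if given, applies the preconditioner to
    the residual.  Names and defaults are illustrative assumptions."""
    x = x / np.linalg.norm(x)
    p = None  # previous-step direction; completes the three-term recurrence
    for _ in range(maxit):
        lam = x @ A @ x                    # Rayleigh quotient (x has unit norm)
        r = A @ x - lam * x                # eigenvalue residual
        if np.linalg.norm(r) < tol:
            break
        w = T(r) if T is not None else r   # preconditioned residual
        # Rayleigh-Ritz on the trial subspace span{x, w, p}
        S = np.column_stack([x, w] if p is None else [x, w, p])
        Q, _ = np.linalg.qr(S)             # orthonormal basis for stability
        _, Y = np.linalg.eigh(Q.T @ A @ Q) # small projected eigenproblem
        x_new = Q @ Y[:, 0]                # Ritz vector for the smallest Ritz value
        # component of the new iterate orthogonal to the old one serves as
        # the implicitly "conjugate" direction for the next subspace
        p = x_new - x * (x @ x_new)
        x = x_new / np.linalg.norm(x_new)
    return float(x @ A @ x), x
```

Without the direction `p` this reduces to preconditioned steepest descent on the Rayleigh quotient; keeping `p` in the trial subspace is what makes each step locally optimal over the three-term recurrence.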