A block conjugate gradient method with polynomial filters for symmetric eigenvalue problems: practice and global quasi-optimality
Fei Xue, Tianqi Zhang
Abstract
We propose a new block variant of the preconditioned conjugate gradient (PCG) method for computing the lowest eigenvalues of standard symmetric eigenvalue problems (\(Av = \lambda v\)) and product eigenvalue problems (\(KMv = \lambda^2 v\)) [1], the latter arising, for example, from the Bethe-Salpeter equation. The algorithm combines the well-known strengths of the locally optimal PCG method [4] and Chebyshev polynomial filtering [2, 3] to achieve robust and rapid convergence when computing potentially many of the lowest eigenvalues. The convergence of the new method is not very sensitive to the quality of the preconditioner or to the parameters of the polynomial filter, choices that are usually critical to the performance of PCG methods and polynomial-based algorithms. Numerical experiments demonstrate its improved robustness and lower runtime compared to several other algorithms, such as \(M\)-LOBPCG [6], Chebyshev-Davidson [2, 3], and LOBP4DCG [5].
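To make the combination concrete, here is a minimal NumPy sketch of the underlying idea: a Chebyshev filter damps the unwanted part of the spectrum in the block residual, and a Rayleigh-Ritz step over the current iterates, the filtered residuals, and the previous directions supplies the locally optimal update. This is an illustration, not the algorithm of [1]; the function names, the filter-interval heuristic, and all parameters below are assumptions made for the sketch.

```python
import numpy as np

def chebyshev_filter(A, X, degree, a, b):
    """Damp eigencomponents of the block X lying in [a, b] and amplify
    those below a, via the three-term Chebyshev recurrence in
    t(A) = (A - c*I) / e."""
    e, c = (b - a) / 2.0, (b + a) / 2.0
    Y_prev, Y = X, (A @ X - c * X) / e
    for _ in range(2, degree + 1):
        Y_prev, Y = Y, 2.0 * (A @ Y - c * Y) / e - Y_prev
    return Y

def rayleigh_ritz(A, S, k):
    """Orthonormalize the search space S and extract the k lowest Ritz pairs."""
    Q, _ = np.linalg.qr(S)
    theta, C = np.linalg.eigh(Q.T @ A @ Q)
    return theta[:k], Q @ C[:, :k]

def filtered_lobpcg(A, k=4, degree=8, maxit=200, tol=1e-8, seed=0):
    """A sketch of one filtered locally optimal block iteration (no locking,
    no preconditioner, standard problem only)."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    X, _ = np.linalg.qr(rng.standard_normal((n, k)))
    theta, X = rayleigh_ritz(A, X, k)
    P = None
    b = np.linalg.norm(A, 1)               # crude upper bound on lambda_max
    for _ in range(maxit):
        R = A @ X - X * theta              # block residual
        if np.linalg.norm(R) <= tol * max(np.abs(theta).max(), 1.0):
            break
        # Damp components above the largest current Ritz value; this
        # interval choice is a simple heuristic, not a tuned one.
        a = theta[-1] + 0.05 * (b - theta[-1])
        W = chebyshev_filter(A, R, degree, a, b)
        blocks = [X, W] if P is None else [X, W, P]
        theta, X_new = rayleigh_ritz(A, np.hstack(blocks), k)
        P = X_new - X @ (X.T @ X_new)      # retain the "direction" component
        X = X_new
    return theta, X

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 400
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    A = Q @ np.diag(np.arange(1.0, n + 1)) @ Q.T   # known spectrum 1, 2, ..., n
    A = (A + A.T) / 2
    theta, X = filtered_lobpcg(A, k=4)
    print("computed:", theta)
    print("exact:   ", np.linalg.eigvalsh(A)[:4])
```

The sketch omits the product-problem structure \(KMv = \lambda^2 v\), preconditioning, deflation, and locking, all of which matter in practice.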
On the theoretical side, we show that an idealized version of this algorithm (and of similar methods) exhibits global quasi-optimality if the starting vector is not far from the eigenvector associated with the lowest eigenvalue: in early iterations, the Rayleigh quotient of the iterates computed by the locally optimal algorithm stays close to that computed by the corresponding globally optimal algorithm, until the latter eventually pulls ahead. This behavior parallels the global optimality of the conjugate gradient method for solving a symmetric positive definite system of linear equations [8, Section 5.1]. The theory provides insight into the competitiveness of the family of locally optimal methods that augment the search space with previous search directions, such as LOBPCG and thick-restarted Lanczos+\(k\) methods [7].
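To fix ideas, in simplified single-vector, unfiltered notation (introduced here for illustration only), let \(\rho(x) = \frac{x^T A x}{x^T x}\) denote the Rayleigh quotient and \(w_j\) the preconditioned residual at step \(j\). The locally optimal method computes
\[
x_{k+1} = \operatorname*{arg\,min}_{x \,\in\, \mathrm{span}\{x_k,\, w_k,\, x_{k-1}\}} \rho(x),
\]
whereas its globally optimal counterpart minimizes over the entire accumulated search space,
\[
x_{k+1}^{\mathrm{glob}} = \operatorname*{arg\,min}_{x \,\in\, \mathrm{span}\{x_0,\, w_0,\, w_1,\, \dots,\, w_k\}} \rho(x).
\]
Global quasi-optimality asserts that \(\rho(x_k) - \rho(x_k^{\mathrm{glob}})\) remains small in early iterations whenever \(x_0\) is sufficiently close to the lowest eigenvector.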
References
[1] T. Zhang and F. Xue, A Chebyshev locally optimal block preconditioned conjugate gradient method for product and standard symmetric eigenvalue problems, to appear in SIAM J. Matrix Anal. Appl., 2024.

[2] Y. Zhou and Y. Saad, A Chebyshev-Davidson algorithm for large symmetric eigenproblems, SIAM J. Matrix Anal. Appl., Vol. 29 (2007), pp. 954-971.

[3] Y. Zhou, A block Chebyshev-Davidson method with inner-outer restart for large eigenvalue problems, J. Comput. Phys., Vol. 229 (2010), pp. 9188-9200.

[4] A. V. Knyazev, Toward the optimal preconditioned eigensolver: locally optimal block preconditioned conjugate gradient method, SIAM J. Sci. Comput., Vol. 23 (2001), pp. 517-541.

[5] Z. Bai and R.-C. Li, Minimization principles for the linear response eigenvalue problem II: Computation, SIAM J. Matrix Anal. Appl., Vol. 34 (2013), pp. 392-416.

[6] E. Vecharynski, J. Brabec, M. Shao, N. Govind, and C. Yang, Efficient block preconditioned eigensolvers for linear response time-dependent density functional theory, Comput. Phys. Commun., Vol. 221 (2017), pp. 42-52.

[7] L. Wu, F. Xue, and A. Stathopoulos, TRPL+k: Thick-restart preconditioned Lanczos+k method for large symmetric eigenvalue problems, SIAM J. Sci. Comput., Vol. 41 (2019), pp. A1013-A1040.

[8] J. Nocedal and S. J. Wright, Numerical Optimization, 2nd ed., Springer, New York, 2006.