
A block conjugate gradient method with polynomial filters for symmetric eigenvalue problems: practice and global quasi-optimality

Fei Xue, Tianqi Zhang

Abstract

  We propose a new block variant of the preconditioned conjugate gradient (PCG) method for computing the lowest eigenvalues of standard symmetric eigenvalue problems (\(Av = \lambda v\)) and product eigenvalue problems (\(KMv = \lambda ^2 v\)) [1] that arise, for example, from the Bethe-Salpeter equation. The algorithm combines the well-known strengths of locally optimal PCG [4] and Chebyshev polynomial filtering [2, 3], achieving robust and rapid convergence when computing a potentially large number of the lowest eigenvalues. The convergence of the new method is not very sensitive to the quality of the preconditioner or to the parameters of the polynomial filter, factors that are usually critical to the performance of PCG methods and polynomial-based algorithms. Numerical experiments demonstrate improved robustness and runtime compared with several other algorithms, including \(M\)-LOBPCG [6], Chebyshev-Davidson [2, 3], and LOBP4DCG [5].
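  To make the structure of such an iteration concrete, the following is a minimal sketch (in Python/NumPy), not the authors' implementation, of one locally optimal block step in which a degree-\(m\) Chebyshev filter is applied to the residual block in place of a preconditioner. The function names chebyshev_filter and lo_block_step, the damped interval \([a, b]\), and the choice to filter the raw residuals are illustrative assumptions.

  # A minimal sketch, not the authors' implementation: one locally optimal
  # block step with a degree-m Chebyshev filter acting on the residual block.
  # chebyshev_filter, lo_block_step and the interval [a, b] (the part of the
  # spectrum to be damped) are illustrative assumptions.
  import numpy as np

  def chebyshev_filter(A, X, m, a, b):
      # Apply p_m(A) to the columns of X, where p_m is the degree-m Chebyshev
      # polynomial mapped so that values in [a, b] are damped.
      e, c = (b - a) / 2.0, (b + a) / 2.0    # half-width and center of [a, b]
      Y_old, Y = X, (A @ X - c * X) / e      # degree-0 and degree-1 terms
      for _ in range(2, m + 1):              # three-term Chebyshev recurrence
          Y_old, Y = Y, 2.0 * (A @ Y - c * Y) / e - Y_old
      return Y

  def lo_block_step(A, X, X_prev, m, a, b):
      # One locally optimal step: Rayleigh-Ritz on span{X, W, X_prev}, where W
      # is the Chebyshev-filtered residual block (X has orthonormal columns).
      k = X.shape[1]
      H = X.T @ (A @ X)                      # block Rayleigh quotient
      R = A @ X - X @ H                      # residual block
      W = chebyshev_filter(A, R, m, a, b)    # filtered residuals as directions
      S, _ = np.linalg.qr(np.hstack([X, W, X_prev]))
      theta, C = np.linalg.eigh(S.T @ (A @ S))
      return S @ C[:, :k], theta[:k]         # k lowest Ritz pairs

  At the first iteration X_prev can be taken as an empty \(n \times 0\) block; a practical implementation would also deflate converged Ritz vectors, handle the product problem \(KMv = \lambda^2 v\), and choose \([a, b]\) from the current Ritz values, none of which is shown in this sketch.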

  On the theoretical side, we show that the ideal version of this algorithm (and similar ones) exhibits global quasi-optimality when the starting vector is not far from the eigenvector associated with the lowest eigenvalue: in early iterations, the Rayleigh quotient of the iterates computed by the locally optimal algorithm is close to that computed by the corresponding globally optimal algorithm, until the latter eventually pulls ahead. This behavior is analogous to the global optimality of the conjugate gradient method for solving a symmetric positive definite system of linear equations [8, Section 5.1]. The theory provides insight into the competitiveness of the family of locally optimal methods with search directions, such as LOBPCG and thick-restarted Lanczos + \(k\) methods [7].
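  For orientation, the global optimality of CG invoked in this analogy is the classical Krylov-subspace minimization property; the display below is the standard Chebyshev-based bound (not a result of this work), stated under the usual assumption that \(A\) is symmetric positive definite with condition number \(\kappa\), \(x_\ast = A^{-1}b\), and \(\|v\|_A = (v^{T} A v)^{1/2}\):
  \[ \|x_k - x_\ast\|_A \;=\; \min_{p \in \mathcal{P}_k,\; p(0)=1} \|p(A)\,(x_0 - x_\ast)\|_A \;\le\; 2\left(\frac{\sqrt{\kappa}-1}{\sqrt{\kappa}+1}\right)^{k} \|x_0 - x_\ast\|_A . \]
  The quasi-optimality result described above plays the role of this minimization property for the locally optimal eigenvalue iteration, but only up to the iteration at which the globally optimal method pulls ahead.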

References