Convergence Analysis for Nonlinear GMRES
Yunhui He
Department of Mathematics, University of Houston, Houston, USA
Abstract
We are interested in solving the following nonlinear system of equations
\begin{equation} g(x)=0, \end{equation}
where \(x\in \mathbb {R}^n\) and \(g: \mathbb {R}^n \rightarrow \mathbb {R}^n\).
Consider the following fixed-point iteration
\begin{equation} x_{k+1}=q(x_k)=x_k-g(x_k). \end{equation}
In practice, the fixed-point iteration often converges slowly or even diverges, so we seek methods to accelerate it.
Define the \(k\)-th residual of the fixed-point iteration as
\begin{equation} r(x_k)=x_k-q(x_k)=g(x_k). \end{equation}
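The plain fixed-point iteration and its residual can be sketched as follows; this is a minimal NumPy illustration, and the function and tolerance names are our own choices, not from the text:

```python
import numpy as np

def fixed_point(g, x0, tol=1e-10, max_iter=500):
    """Plain fixed-point iteration x_{k+1} = q(x_k) = x_k - g(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = g(x)                  # residual r(x_k) = g(x_k)
        if np.linalg.norm(r) < tol:
            break
        x = x - r                 # apply q(x) = x - g(x)
    return x
```

For example, with \(g(x)=x-\cos x\) the map \(q(x)=\cos x\) is a contraction near the fixed point, so the iteration converges, though only linearly and slowly.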
We revisit the nonlinear generalized minimal residual method (NGMRES) following [2, 1]. NGMRES has been used to accelerate the convergence of a fixed-point iteration and is given by Algorithm 1. In practice, we consider the windowed variant with fixed window size \(m\), denoted NGMRES(\(m\)), which differs from restarted GMRES.
Algorithm 1: NGMRES with window size \(m\), denoted NGMRES(\(m\))
1: Given \(x_0\) and \(m\geq 0\)
2: For \(k=1,2,\cdots \) until convergence Do:
• set \(m_k=\min \{k,m\}\)
• compute
\begin{equation} \label {eq:xkp1} x_{k+1} = q(x_k) + \sum _{i=0}^{m_k}\beta _i^{(k)} \left (q(x_k)-x_{k-i} \right ), \end{equation}
where \(\beta _i^{(k)}\) is obtained by solving the following least-squares problem
\begin{equation} \label {eq:min} \min _{\beta _i^{(k)}} \left \|g(q(x_k))+\sum _{i=0}^{m_k} \beta _i^{(k)} \left (g(q(x_k))-g(x_{k-i}) \right ) \right \|_2^2. \end{equation}
EndDo
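Algorithm 1 can be sketched in NumPy as follows. The update for \(x_{k+1}\) and the least-squares problem for \(\beta_i^{(k)}\) follow the text; the history trimming, stopping test, and starting index are our own illustrative choices:

```python
import numpy as np

def ngmres(g, x0, m=3, tol=1e-10, max_iter=100):
    """Windowed nonlinear GMRES, NGMRES(m), as a sketch of Algorithm 1."""
    q = lambda x: x - g(x)               # underlying fixed-point map
    xs = [np.asarray(x0, dtype=float)]   # iterate history x_{k-i}
    gs = [g(xs[0])]                      # residual history g(x_{k-i})
    for k in range(max_iter):            # loop starts at k = 0, so the
        xk = xs[-1]                      # first step uses only x_0
        qx = q(xk)
        gq = g(qx)
        if np.linalg.norm(gq) < tol:
            return qx
        mk = min(k, m)
        # columns: g(q(x_k)) - g(x_{k-i}), i = 0, ..., m_k
        A = np.column_stack([gq - gs[-1 - i] for i in range(mk + 1)])
        # least-squares problem: min_beta || gq + A @ beta ||_2
        beta, *_ = np.linalg.lstsq(A, -gq, rcond=None)
        # update: x_{k+1} = q(x_k) + sum_i beta_i (q(x_k) - x_{k-i})
        xk1 = qx + sum(b * (qx - xs[-1 - i]) for i, b in enumerate(beta))
        xs.append(xk1)
        gs.append(g(xk1))
        xs, gs = xs[-(m + 1):], gs[-(m + 1):]   # keep only the window
    return xs[-1]
```

With \(m=0\) the history reduces to the current iterate alone, matching the NGMRES(0) case analyzed below; `np.linalg.lstsq` is used here for robustness when the columns of the least-squares matrix are nearly dependent.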
To the best of our knowledge, no convergence analysis exists for NGMRES(\(m\)) applied to nonlinear problems. In this work, under standard assumptions used for iterative methods for nonlinear problems, we prove that for general \(m>0\) the residuals of NGMRES(\(m\)) converge r-linearly, and that for \(m=0\) the residuals of NGMRES(0) converge q-linearly.
Finally, we present numerical results to demonstrate the performance of the NGMRES(\(m\)) method, and we compare it with the well-known Anderson acceleration [3].
References
[1] H. De Sterck, Steepest descent preconditioning for nonlinear GMRES optimization, Numerical Linear Algebra with Applications, 20(3): 453–471, 2013.
[2] H. De Sterck, A nonlinear GMRES optimization algorithm for canonical tensor decomposition, SIAM Journal on Scientific Computing, 34(3): A1351–A1379, 2012.
[3] D. G. Anderson, Iterative procedures for nonlinear integral equations, Journal of the ACM, 12(4): 547–560, 1965.