
KIAM Preprint № 164, Moscow, 2018
Authors: Oseledets I.V., Botchev M.A., Katrutsa A.M., Ovchinnikov G.V.
How to optimize preconditioners for the conjugate gradient method: a stochastic approach
The conjugate gradient (CG) method is usually used with a preconditioner, which improves the efficiency and robustness of the method. Many preconditioners include parameters, and a proper choice of the preconditioner and its parameters is often a nontrivial task. Although many convergence estimates exist that can be used for optimizing preconditioners, they typically hold for all initial guess vectors and thus reflect the worst-case convergence rate. To account for the mean convergence rate instead, in this paper we follow a simple stochastic approach. It is based on trial runs with random initial guess vectors and leads to a functional that can be used to monitor convergence and to optimize preconditioner parameters in CG. The numerical experiments presented show that optimizing this new functional usually yields a better parameter value than optimizing the functional based on the spectral condition number.
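The trial-run idea from the abstract can be illustrated with a minimal sketch: for each candidate value of a preconditioner parameter, run preconditioned CG from several random initial guesses, average the iteration counts, and pick the parameter with the smallest mean. The SSOR preconditioner with relaxation parameter omega, the 1D Laplacian test matrix, and all function names below are illustrative assumptions, not the authors' exact setup (the paper uses a relaxed incomplete Cholesky preconditioner and a convergence functional, not a raw iteration count).

```python
import numpy as np
from scipy.sparse import diags, tril, triu
from scipy.sparse.linalg import cg, LinearOperator, spsolve_triangular

rng = np.random.default_rng(0)

# Standard SPD test matrix: 1D Laplacian (an assumption for illustration)
n = 100
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
d = A.diagonal()

def mean_cg_iters(omega, trials=10):
    """Mean CG iteration count over random initial guesses,
    with an SSOR(omega) preconditioner (0 < omega < 2)."""
    # M(omega)^{-1} v is applied via two triangular solves; the scalar
    # factor omega*(2-omega) is dropped since CG is invariant to it.
    Lw = (tril(A, -1) + diags(d / omega)).tocsr()   # D/omega + L
    Uw = (triu(A, 1) + diags(d / omega)).tocsr()    # D/omega + U = Lw^T
    def apply_Minv(v):
        y = spsolve_triangular(Lw, v, lower=True)
        return spsolve_triangular(Uw, d * y, lower=False)
    M = LinearOperator((n, n), matvec=apply_Minv)

    total = 0
    for _ in range(trials):
        x0 = rng.standard_normal(n)          # random initial guess
        count = [0]
        cg(A, b, x0=x0, M=M, maxiter=5 * n,
           callback=lambda xk: count.__setitem__(0, count[0] + 1))
        total += count[0]
    return total / trials

# Scan a few parameter values and keep the one with the best mean rate
scores = {w: mean_cg_iters(w) for w in (0.5, 1.0, 1.5)}
best_omega = min(scores, key=scores.get)
```

In practice one would replace the brute-force scan with a proper optimizer over the functional, and the iteration count with a smoother convergence measure, as the paper proposes.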
Keywords: conjugate gradient method, preconditioners, condition number, eigenvalue clustering, relaxed incomplete Cholesky preconditioner
Publication language: Russian, pages: 26
Research direction:
Programming, parallel computing, multimedia
About authors:
  • Oseledets Ivan Valer'evich, Skolkovo Institute of Science and Technology
  • Botchev Mikhail Aleksandrovich, RAS
  • Katrutsa Aleksandr Mikhaylovich, Skolkovo Institute of Science and Technology
  • Ovchinnikov Georgiy Victorovich, Skolkovo Institute of Science and Technology