<eBook>
Nonlinear Conjugate Gradient Methods for Unconstrained Optimization / by Neculai Andrei
(Springer Optimization and Its Applications. ISSN: 1931-6836 ; 158)

1st ed. 2020.
Publisher: Cham : Springer International Publishing : Imprint: Springer
Publication Year: 2020
Language: English
Physical Description: XXVIII, 498 p. 93 illus., 90 illus. in color : online resource
Author Heading: Andrei, Neculai, author
SpringerLink (Online service)
Subjects: LCSH:Mathematical optimization
LCSH:Mathematical models
FREE:Optimization
FREE:Mathematical Modeling and Industrial Mathematics
General Note: 1. Introduction -- 2. Linear Conjugate Gradient Algorithm -- 3. General Convergence Results for Nonlinear Conjugate Gradient Methods -- 4. Standard Conjugate Gradient Methods -- 5. Acceleration of Conjugate Gradient Algorithms -- 6. Hybrid and Parameterized Conjugate Gradient Methods -- 7. Conjugate Gradient Methods as Modifications of the Standard Schemes -- 8. Conjugate Gradient Methods Memoryless BFGS Preconditioned -- 9. Three-Term Conjugate Gradient Methods -- 10. Other Conjugate Gradient Methods -- 11. Discussion and Conclusions -- References -- Author Index -- Subject Index
Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton method (truncated Newton method) and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include: linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid and parameterized methods, modifications of the standard scheme, memoryless BFGS preconditioned methods, and three-term methods. Other conjugate gradient methods, based on clustering the eigenvalues or on minimizing the condition number of the iteration matrix, are also treated. For each method, the convergence analysis, the computational performance, and comparisons with other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms presented as a methodology is developed with a clear, rigorous, and friendly exposition; readers will gain an understanding of their properties and convergence and will learn to develop and prove the convergence of their own methods. Numerous numerical studies are supplied, with comparisons and comments on the behavior of conjugate gradient algorithms for solving a collection of 800 unconstrained optimization problems of different structures and complexities, with the number of variables in the range [1000, 10000]. The book is addressed to all those interested in developing and using new advanced techniques for solving complex unconstrained optimization problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering and industry, as well as graduate students in mathematics, Ph.D. and master's students in mathematical programming, will find plenty of information and practical applications for solving large-scale unconstrained optimization problems by conjugate gradient methods.
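For orientation only, the following is a minimal sketch of the nonlinear conjugate gradient iteration that this class of methods builds on, written in Python. It assumes a Polak-Ribiere+ conjugacy parameter and a simple backtracking (Armijo) line search; these choices, the function names, and the Rosenbrock test example are illustrative assumptions and do not reproduce any specific algorithm from the book.

    import numpy as np

    def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
        """Illustrative nonlinear conjugate gradient sketch (not from the book):
        Polak-Ribiere+ beta with a backtracking Armijo line search."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                           # first direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            if g.dot(d) >= 0:            # safeguard: restart if not a descent direction
                d = -g
            # Backtracking line search enforcing the Armijo condition
            alpha, c, rho = 1.0, 1e-4, 0.5
            while alpha > 1e-12 and f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
                alpha *= rho
            x_new = x + alpha * d
            g_new = grad(x_new)
            # Polak-Ribiere+ conjugacy parameter, clipped at zero
            beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
            d = -g_new + beta * d        # new search direction
            x, g = x_new, g_new
        return x

    # Example (hypothetical): minimize the two-variable Rosenbrock function
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))

The standard, hybrid, three-term, and preconditioned methods surveyed in the book differ mainly in how the conjugacy parameter and the search direction update are chosen and safeguarded.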
HTTP:URL=https://doi.org/10.1007/978-3-030-42950-8

Holdings: eBook (Online)
Springer eBooks 9783030429508
Electronic resource: EB00226356

Material Type: eBook
Classification: LCC:QA402.5-402.6
DC23:519.6
Bibliographic ID: 4000134880
ISBN 9783030429508