Two families of methods are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton and truncated Newton methods on the one hand, and the conjugate gradient methods on the other. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons with the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include: linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid and parameterized methods, modifications of the standard scheme, memoryless BFGS preconditioned methods, and three-term methods. Other conjugate gradient methods, based on clustering the eigenvalues or on minimizing the condition number of the iteration matrix, are also treated. For each method, the convergence analysis, the computational performance, and comparisons with other conjugate gradient methods are given.
The theory behind the conjugate gradient algorithms, presented as a methodology, is developed with a clear, rigorous, and friendly exposition; readers will gain an understanding of the properties and convergence of these methods and will learn to develop and prove the convergence of their own methods. Numerous numerical studies are supplied, with comparisons and comments on the behavior of conjugate gradient algorithms applied to a collection of 800 unconstrained optimization problems of different structures and complexities, with the number of variables in the range [1000, 10000]. The book is addressed to all those interested in developing and using new advanced techniques for solving complex unconstrained optimization problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering and industry, as well as graduate students in mathematics and Ph.D. and master's students in mathematical programming, will find plenty of information and practical guidance for solving large-scale unconstrained optimization problems and applications by conjugate gradient methods.
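As a point of reference for the iteration scheme the book develops, the following is a minimal sketch of one standard nonlinear conjugate gradient method (Polak-Ribière+, one of the beta_k choices studied in Chapter 4), paired with a simple Armijo backtracking line search standing in for the standard and strong Wolfe line searches analyzed in Chapter 3. All function and parameter names below are illustrative assumptions, not code from the book.

# Minimal sketch of a nonlinear conjugate gradient method (Polak-Ribiere+)
# with an Armijo backtracking line search. Illustrative only; the book's
# algorithms use Wolfe-type line searches and additional safeguards.
import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimize f: R^n -> R starting from x0 using Polak-Ribiere+ CG."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g @ d >= 0:              # safeguard: restart with steepest descent
            d = -g
        # Backtracking line search on the Armijo sufficient-decrease condition.
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * slope:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Polak-Ribiere beta with the '+' truncation, beta = max(0, PR beta).
        beta = max(0.0, (g_new @ y) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: the 2-D Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(cg_minimize(f, grad, np.array([-1.2, 1.0])))

The choice of beta_k is what distinguishes the standard methods (Fletcher-Reeves, Polak-Ribière, Hestenes-Stiefel, Dai-Yuan, and others) surveyed in Chapter 4; swapping the beta line above is enough to obtain a different member of the family.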
Author(s): Neculai Andrei
Series: Springer Optimization and Its Applications 158
Publisher: Springer
Year: 2020
Language: English
Pages: 526
Preface
Contents
List of Figures
List of Tables
List of Algorithms
1 Introduction: Overview of Unconstrained Optimization
1.1 The Problem
1.2 Line Search
1.3 Optimality Conditions for Unconstrained Optimization
1.4 Overview of Unconstrained Optimization Methods
1.4.1 Steepest Descent Method
1.4.2 Newton Method
1.4.3 Quasi-Newton Methods
1.4.4 Modifications of the BFGS Method
1.4.5 Quasi-Newton Methods with Diagonal Updating of the Hessian
1.4.6 Limited-Memory Quasi-Newton Methods
1.4.7 Truncated Newton Methods
1.4.8 Conjugate Gradient Methods
1.4.9 Trust-Region Methods
1.4.10 p-Regularized Methods
1.5 Test Problems and Applications
1.6 Numerical Experiments
2 Linear Conjugate Gradient Algorithm
2.1 Line Search
2.2 Fundamental Property of the Line Search Method with Conjugate Directions
2.3 The Linear Conjugate Gradient Algorithm
2.4 Convergence Rate of the Linear Conjugate Gradient Algorithm
2.5 Comparison of the Convergence Rate of the Linear Conjugate Gradient and of the Steepest Descent
2.6 Preconditioning of the Linear Conjugate Gradient Algorithms
3 General Convergence Results for Nonlinear Conjugate Gradient Methods
3.1 Types of Convergence
3.2 The Concept of Nonlinear Conjugate Gradient
3.3 General Convergence Results for Nonlinear Conjugate Gradient Methods
3.3.1 Convergence Under the Strong Wolfe Line Search
3.3.2 Convergence Under the Standard Wolfe Line Search
3.4 Criticism of the Convergence Results
4 Standard Conjugate Gradient Methods
4.1 Conjugate Gradient Methods with $\|g_{k+1}\|^2$ in the Numerator of $\beta_k$
4.2 Conjugate Gradient Methods with $g_{k+1}^T y_k$ in the Numerator of $\beta_k$
4.3 Numerical Study
5 Acceleration of Conjugate Gradient Algorithms
5.1 Standard Wolfe Line Search with Cubic Interpolation
5.2 Acceleration of Nonlinear Conjugate Gradient Algorithms
5.3 Numerical Study
6 Hybrid and Parameterized Conjugate Gradient Methods
6.1 Hybrid Conjugate Gradient Methods Based on the Projection Concept
6.2 Hybrid Conjugate Gradient Methods as Convex Combinations of the Standard Conjugate Gradient Methods
6.3 Parameterized Conjugate Gradient Methods
7 Conjugate Gradient Methods as Modifications of the Standard Schemes
7.1 Conjugate Gradient with Dai and Liao Conjugacy Condition (DL)
7.2 Conjugate Gradient with Guaranteed Descent (CG-DESCENT)
7.3 Conjugate Gradient with Guaranteed Descent and Conjugacy Conditions and a Modified Wolfe Line Search (DESCON)
8 Conjugate Gradient Methods Memoryless BFGS Preconditioned
8.1 Conjugate Gradient Memoryless BFGS Preconditioned (CONMIN)
8.2 Scaling Conjugate Gradient Memoryless BFGS Preconditioned (SCALCG)
8.3 Conjugate Gradient Method Closest to Scaled Memoryless BFGS Search Direction (DK/CGOPT)
8.4 New Conjugate Gradient Algorithms Based on Self-Scaling Memoryless BFGS Updating
9 Three-Term Conjugate Gradient Methods
9.1 A Three-Term Conjugate Gradient Method with Descent and Conjugacy Conditions (TTCG)
9.2 A Three-Term Conjugate Gradient Method with Subspace Minimization (TTS)
9.3 A Three-Term Conjugate Gradient Method with Minimization of One-Parameter Quadratic Model of Minimizing Function (TTDES)
10 Preconditioning of the Nonlinear Conjugate Gradient Algorithms
10.1 Preconditioners Based on Diagonal Approximations to the Hessian
10.2 Criticism of Preconditioning the Nonlinear Conjugate Gradient Algorithms
11 Other Conjugate Gradient Methods
11.1 Eigenvalues Versus Singular Values in Conjugate Gradient Algorithms (CECG and SVCG)
11.2 A Conjugate Gradient Algorithm with Guaranteed Descent and Conjugacy Conditions (CGSYS)
11.3 Combination of Conjugate Gradient with Limited-Memory BFGS Methods
11.4 Conjugate Gradient with Subspace Minimization Based on Regularization Model of the Minimizing Function
12 Discussions, Conclusions, and Large-Scale Optimization
Appendix A Mathematical Review
A.1 Elements of Linear Algebra
A.2 Elements of Analysis
A.3 Elements of Topology in the Euclidean Space $\mathbb{R}^n$
A.4 Elements of Convexity—Convex Sets and Convex Functions
Appendix B UOP: A Collection of 80 Unconstrained Optimization Test Problems
References
Author Index
Subject Index