This textbook offers a guided tutorial that reviews the theoretical fundamentals while working through practical examples used to construct the computational framework, applied to various real-life models.
Computational Optimization: Success in Practice will lead readers through the entire process. They will start with simple calculus examples of data fitting and the basics of optimal control methods, and end by constructing a multi-component framework for running PDE-constrained optimization. This framework is assembled piece by piece; readers may apply this process at the level of complexity matching their current projects or research needs.
By connecting the examples with the theory and discussing the proper "communication" between them, readers will learn the process of building a "big house." Moreover, they can use the framework exemplified in the book as a template for their own research or course problems: they will know how to replace individual "bricks" or add extra "floors" on top.
This book is for students, faculty, and researchers.
Features
The main optimization framework is built through the course exercises and centers on MATLAB®.
All other scripts, which implement computations for solving optimization problems with various models, use only open-source software, e.g., FreeFEM.
All computational steps are platform-independent; readers may freely use Windows, macOS, or Linux systems.
All scripts illustrating every step in building the optimization framework are available to readers online.
Each chapter contains problems based on the examples provided in the text and their associated scripts. Readers will not need to create scripts from scratch, but rather will modify the codes provided as a supplement to the book.
This book will prove valuable to graduate students of mathematics, computer science, and engineering, and to all who explore optimization techniques at various levels for educational or research purposes. It will also benefit professionals in academic and industry research: professors, researchers, postdoctoral fellows, and personnel of R&D departments.
Author(s): Vladislav Bukshtynov
Series: Textbooks in Mathematics
Publisher: CRC Press/Chapman & Hall
Year: 2022
Language: English
Pages: 414
City: Boca Raton
Preface
Author
Acronyms and Abbreviations
List of Algorithms
1. Introduction to Optimization
1.1. Optimization Models
1.2. General Notations for Optimization Problem
1.3. Data Fitting Examples
1.4. Optimization Fundamentals
1.4.1. Feasibility
1.4.2. Optimality
1.4.3. Convexity
1.5. General Optimization Algorithm
1.5.1. Solving Optimization Problems in Iterations
1.5.2. Termination Criteria
1.6. Convergence
1.7. Homework Problems
2. Minimization Approaches for Functions of One Variable
2.1. Minimizing a Function in 1D
2.2. Bisection Method
2.3. Golden Section Search
2.4. Newton's Method
2.5. Brute-Force Search
2.6. Monte Carlo Method
2.7. Practical Examples
2.8. Computational Analysis for Convergence
2.9. Homework Problems
2.10. Lab Assignment #1: Review Chapters 1–2
3. Generalized Optimization Framework
3.1. Parameter Identification for Least-Square Data Fitting
3.2. Generalized Optimization Framework
3.2.1. Computational Components
3.2.2. Choice of Proper Software
3.3. Choosing and Adjusting Optimization Algorithms
3.4. Visualization and Analysis of Obtained Solutions
3.5. Testing and Dealing with Problems (Debugging)
3.6. TEST Mode for the Gradient-based Framework
3.7. Accuracy and Performance
3.8. Communication within the Framework
3.9. Homework Problems
4. Exploring Optimization Algorithms
4.1. Iterative Optimization Algorithms Revisited
4.2. Gradient-based Strategies: Line Search vs. Trust Region
4.2.1. Line Search
4.2.2. Trust Region
4.2.3. Comparing Strategies
4.3. Heuristic Algorithms
4.4. Particle Swarm Optimization
4.5. Overview of Optimization Algorithms
4.6. Homework Problems
5. Line Search Algorithms
5.1. Local and Global Minimums: Theory Review
5.1.1. Necessary Conditions
5.1.2. Necessary vs. Sufficient Conditions
5.1.3. Existence of Global Minimums
5.2. Selecting Search Direction
5.2.1. Principal Generalization
5.2.2. Some Theory for Convergence
5.2.3. Steepest Descent
5.3. Newton's Method and Newton-based Modifications
5.3.1. Pure Newton's Method
5.3.2. Discretized Newton's Method
5.3.3. Modified Newton's Method
5.3.4. Diagonally Scaled SD
5.3.5. Quasi-Newton Methods
5.3.6. Gauss-Newton Approach
5.4. Conjugate Gradient
5.5. Line Search Performance Comparison
5.6. Homework Problems
5.7. Lab Assignment #2: Review Chapters 3–5
6. Choosing Optimal Step Size
6.1. Overview
6.2. Simple Approaches
6.3. Inexact Line Search
6.3.1. Armijo Rule
6.3.2. Wolfe Conditions
6.3.3. Goldstein Conditions
6.3.4. Backtracking Line Search
6.4. Step Size Performance Comparison
6.5. Advanced Methods for 1D Search
6.6. Brent's Method
6.7. Bracketing-Brent Toolbox in MATLAB
6.7.1. General Description
6.7.2. Initial Bracketing with fn_min_brack.m
6.7.3. Brent Minimization with fn_brent.m
6.7.4. Technicalities for MATLAB Implementation
6.8. Bracketing-Brent Toolbox Performance
6.9. Homework Problems
7. Trust Region and Derivative-Free Methods
7.1. Trust Region Outline
7.2. General Algorithm
7.3. Cauchy Point Calculation
7.4. Improving Cauchy Point by Dogleg Method
7.5. Checking and Tuning Performance
7.6. Exploring Derivative-Free Options
7.7. Homework Problems
7.8. Lab Assignment #3: Review Chapters 6–7
7.9. Midterm Assignment: Review Chapters 1–7
8. Large-Scale and Constrained Optimization
8.1. Generalization of Large-Scale Optimization
8.2. LSO Examples
8.2.1. Solving Systems of Equations
8.2.2. Space-Dependent Parameter Reconstruction
8.2.3. Parameter Identification – Another Example
8.2.4. State-Dependent Parameter Reconstruction
8.3. Improving Performance of LSO
8.4. General Theory for Constrained Optimization
8.5. Lagrange Multiplier Approach
8.6. Lagrange Multiplier vs. Penalization
8.7. Extending Complexity – DE-constrained Optimization
8.8. KKT Optimality Conditions
8.9. Homework Problems
9. ODE-based Optimization
9.1. Fitting Data by ODE-based Optimization
9.2. Derivative vs. Directional Derivative: Review
9.3. Optimize–then–Discretize
9.3.1. Deriving Gradient
9.3.2. Solving Problem
9.4. Discretize–then–Optimize
9.5. Numerical ODE Solvers
9.5.1. Runge–Kutta Integration Methods
9.5.2. Solving ODEs by MATLAB
9.6. Dynamics of Biological Systems by Lotka–Volterra Equations
9.7. Optimization Problem Constrained by LV Model
9.7.1. Statement of Problem
9.7.2. Deriving Gradients
9.7.3. Optimization Algorithm
9.8. Optimization Framework in MATLAB
9.8.1. Benchmark Models for Controls
9.8.2. Analytic vs. Synthetic Measurements
9.8.3. Adjusting Framework and Choosing Parameters
9.8.4. Checking and Improving Quality of Discretized Gradients
9.8.5. Optimization Results
9.9. Homework Problems
10. Implementing Regularization Techniques
10.1. Motivation for Regularization
10.2. Some Regularization Theory
10.2.1. Quadratic Penalty Function Method
10.2.2. Barrier Functions Method
10.2.3. Tikhonov-type Regularization
10.2.4. Gradient Preconditioning
10.2.5. Bounds by Simple Projections and Slack Variables
10.3. Examples of Numerical Implementation
10.3.1. From Theory to Practice
10.3.2. Adjusting Framework
10.3.3. Results and Food for Thought
10.4. Homework Problems
10.5. Lab Assignment #4: Review Chapters 8–10
11. Moving to PDE-based Optimization
11.1. Generalized Problem of Fitting Data
11.2. Practice Example
11.2.1. Elliptic Equation as Governing PDE
11.2.2. Deriving Gradient by Optimize–then–Discretize
11.2.3. Optimization Algorithm
11.3. Solving PDEs in Higher Dimensions by FreeFEM
11.4. Brief Introduction to Finite Elements Method
11.5. Solving PDEs by FreeFEM
11.5.1. Poisson's Equation in 2D
11.5.2. Coding with FreeFEM
11.5.3. Solution Analysis
11.5.4. Technicalities for FreeFEM Coding
11.6. Homework Problems
12. Sharing Multiple Software Environments
12.1. Practice Example and Benchmark Model
12.2. Updating and Tuning Optimization Framework
12.2.1. Overview of Changes
12.2.2. Software Communication
12.2.3. Another Round on Measurements
12.2.4. Evaluating Objectives
12.2.5. Evaluating Gradients
12.2.6. Checking Quality of Discretized Gradients
12.3. Analyzing Optimization Results
12.4. Homework Problems
12.5. Lab Assignment #5: Review Chapters 11–12
Appendix: Review of Math with MATLAB
Bibliography
Index