A modern, up-to-date introduction to optimization theory and methods.

This authoritative book serves as an introductory text to optimization at the senior undergraduate and beginning graduate levels. With a consistently accessible and elementary treatment of all topics, An Introduction to Optimization, Second Edition helps students build a solid working knowledge of the field, including unconstrained optimization, linear programming, and constrained optimization. Supplemented with more than one hundred tables and illustrations, an extensive bibliography, and numerous worked examples to illustrate both theory and algorithms, this book also provides:

* A review of the required mathematical background material
* A mathematical discussion at a level accessible to MBA and business students
* A treatment of both linear and nonlinear programming
* An introduction to recent developments, including neural networks, genetic algorithms, and interior-point methods
* A chapter on the use of descent algorithms for the training of feedforward neural networks
* Exercise problems after every chapter, many new to this edition
* MATLAB® exercises and examples
* An accompanying Instructor's Solutions Manual, available on request

An Introduction to Optimization, Second Edition helps students prepare for the advanced topics and technological developments that lie ahead. It is also a useful book for researchers and professionals in mathematics, electrical engineering, economics, statistics, and business. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
Author(s): Edwin K. P. Chong, Stanislaw H. Zak
Series: Wiley-Interscience series in discrete mathematics and optimization
Edition: 2
Publisher: Wiley-Interscience
Year: 2001
Language: English
Pages: 495
Cover......Page 1
Contents......Page 8
Preface......Page 14
Part I Mathematical Review......Page 18
1.1 Methods of Proof......Page 20
1.2 Notation......Page 22
Exercises......Page 23
2.1 Real Vector Spaces......Page 24
2.2 Rank of a Matrix......Page 29
2.3 Linear Equations......Page 33
2.4 Inner Products and Norms......Page 35
Exercises......Page 38
3.1 Linear Transformations......Page 40
3.2 Eigenvalues and Eigenvectors......Page 41
3.3 Orthogonal Projections......Page 44
3.4 Quadratic Forms......Page 45
3.5 Matrix Norms......Page 50
Exercises......Page 54
4.2 Hyperplanes and Linear Varieties......Page 58
4.3 Convex Sets......Page 61
4.4 Neighborhoods......Page 63
4.5 Polytopes and Polyhedra......Page 64
Exercises......Page 66
5.1 Sequences and Limits......Page 68
5.2 Differentiability......Page 74
5.3 The Derivative Matrix......Page 76
5.4 Differentiation Rules......Page 78
5.5 Level Sets and Gradients......Page 79
5.6 Taylor Series......Page 83
Exercises......Page 87
Part II Unconstrained Optimization......Page 90
6.1 Introduction......Page 92
6.2 Conditions for Local Minimizers......Page 94
Exercises......Page 102
7.1 Golden Section Search......Page 110
7.2 Fibonacci Search......Page 114
7.3 Newton's Method......Page 122
7.4 Secant Method......Page 125
7.5 Remarks on Line Search Methods......Page 127
Exercises......Page 128
8.1 Introduction......Page 132
8.2 The Method of Steepest Descent......Page 134
8.3.1 Convergence......Page 141
8.3.2 Convergence Rate......Page 148
Exercises......Page 153
9.1 Introduction......Page 158
9.2 Analysis of Newton's Method......Page 161
9.3 Levenberg-Marquardt Modification......Page 164
9.4 Newton's Method for Nonlinear Least-Squares......Page 165
Exercises......Page 168
10.1 Introduction......Page 170
10.2 The Conjugate Direction Algorithm......Page 172
10.3 The Conjugate Gradient Algorithm......Page 177
10.4 The Conjugate Gradient Algorithm for Non-Quadratic Problems......Page 180
Exercises......Page 183
11.1 Introduction......Page 186
11.2 Approximating the Inverse Hessian......Page 187
11.3 The Rank One Correction Formula......Page 190
11.4 The DFP Algorithm......Page 195
11.5 The BFGS Algorithm......Page 199
Exercises......Page 203
12.1 Least-Squares Analysis......Page 206
12.2 Recursive Least-Squares Algorithm......Page 215
12.3 Solution to Ax = b Minimizing ||x||......Page 218
12.4 Kaczmarz's Algorithm......Page 220
12.5 Solving Ax = b in General......Page 223
Exercises......Page 231
13.1 Introduction......Page 238
13.2 Single-Neuron Training......Page 240
13.3 Backpropagation Algorithm......Page 243
Exercises......Page 253
14.1 Basic Description......Page 256
14.1.2 Selection and Evolution......Page 257
14.2 Analysis of Genetic Algorithms......Page 262
14.3 Real-Number Genetic Algorithms......Page 267
Exercises......Page 269
Part III Linear Programming......Page 272
15.1 A Brief History of Linear Programming......Page 274
15.2 Simple Examples of Linear Programs......Page 276
15.3 Two-Dimensional Linear Programs......Page 282
15.4 Convex Polyhedra and Linear Programming......Page 283
15.5 Standard Form Linear Programs......Page 286
15.6 Basic Solutions......Page 291
15.7 Properties of Basic Solutions......Page 295
15.8 A Geometric View of Linear Programs......Page 298
Exercises......Page 301
16.1 Solving Linear Equations Using Row Operations......Page 306
16.2 The Canonical Augmented Matrix......Page 313
16.3 Updating the Augmented Matrix......Page 314
16.4 The Simplex Algorithm......Page 316
16.5 Matrix Form of the Simplex Method......Page 322
16.6 The Two-Phase Simplex Method......Page 326
16.7 The Revised Simplex Method......Page 329
Exercises......Page 334
17.1 Dual Linear Programs......Page 340
17.2 Properties of Dual Problems......Page 347
Exercises......Page 352
18.1 Introduction......Page 358
18.2 Khachiyan's Method......Page 359
18.3.1 Basic Algorithm......Page 362
18.3.2 Two-Phase Method......Page 366
18.4.1 Basic Ideas......Page 367
18.4.2 Karmarkar's Canonical Form......Page 368
18.4.3 Karmarkar's Restricted Problem......Page 370
18.4.4 From General Form to Karmarkar's Canonical Form......Page 371
18.4.5 The Algorithm......Page 375
Exercises......Page 379
Part IV Nonlinear Constrained Optimization......Page 382
19.1 Introduction......Page 384
19.2 Problem Formulation......Page 385
19.3 Tangent and Normal Spaces......Page 387
19.4 Lagrange Condition......Page 393
19.5 Second-Order Conditions......Page 403
19.6 Minimizing Quadratics Subject to Linear Constraints......Page 406
Exercises......Page 410
20.1 Karush-Kuhn-Tucker Condition......Page 416
20.2 Second-Order Conditions......Page 425
Exercises......Page 429
21.1 Introduction......Page 436
21.2 Convex Functions......Page 438
21.3 Convex Optimization Problems......Page 446
Exercises......Page 452
22.2 Projections......Page 458
22.3 Projected Gradient Methods......Page 460
22.4 Penalty Methods......Page 464
Exercises......Page 470
References......Page 474
Index......Page 481