This new, updated edition of Optimal Control reflects major changes that have occurred in the field in recent years and presents, in a clear and direct way, the fundamentals of optimal control theory. It covers the major topics, including measurement, principles of optimality, dynamic programming, variational methods, Kalman filtering, and other solution techniques. To give the reader a sense of the problems that can arise in a hands-on project, the authors have included new material on optimal output feedback control, a technique used in the aerospace industry. Also included are two new chapters on robust control, providing background in this rapidly growing area of interest. Relations to classical control theory are emphasized throughout the text, and a root-locus approach to steady-state controller design is included. A chapter on optimal control of polynomial systems gives the reader sufficient background for further study in the field of adaptive control.
The authors demonstrate through numerous examples that computer simulations of optimal controllers are easy to implement and help give the reader an intuitive feel for the equations. To help build the reader's confidence in understanding the theory and its practical applications, the authors have provided many opportunities throughout the book for writing simple programs.
Optimal Control will also serve as an invaluable reference for control engineers in industry. It offers numerous tables that make it easy to find the equations needed to implement optimal controllers for practical applications. All simulations have been performed using MATLAB and the relevant toolboxes.
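As a small illustration of the kind of simulation the book advocates (the text itself works in MATLAB and its toolboxes), the following sketch designs a continuous-time LQR gain for a hypothetical double-integrator plant and simulates the closed-loop response in Python with NumPy and SciPy. The plant and weighting matrices are assumed for illustration only and are not taken from the book.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical double-integrator plant: x_dot = A x + B u (illustrative, not from the book)
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Assumed LQR weighting matrices for this sketch
Q = np.diag([1.0, 0.1])
R = np.array([[0.1]])

# Solve the algebraic Riccati equation and form the state-feedback gain K = R^{-1} B^T P
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# Simulate the closed-loop system x_dot = (A - B K) x with a simple forward-Euler step
x = np.array([1.0, 0.0])
dt, T = 0.01, 5.0
for _ in range(int(T / dt)):
    x = x + dt * (A - B @ K) @ x

print("LQR gain K =", K)
print("state after 5 s:", x)  # should be close to the origin
```

Running the script drives the state toward the origin, mirroring the intuitive feel for the Riccati-based design equations that the book's worked examples aim to build.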
Optimal Control assumes a background in the state-variable representation of systems; because matrix manipulations are the basic mathematical vehicle of the book, a short review is included in the appendix.
As a lucid introductory text and an indispensable reference, this new edition of Optimal Control will serve the needs of both the professional engineer and the advanced student in mechanical, electrical, and aerospace engineering. Its coverage encompasses all the fundamental topics as well as the major changes of recent years, including output-feedback design and robust design. An abundance of computer simulations using MATLAB and the relevant toolboxes gives the reader hands-on experience in applying the theory to real-world situations. Major topics covered include:
* Static Optimization
* Optimal Control of Discrete-Time Systems
* Optimal Control of Continuous-Time Systems
* The Tracking Problem and Other LQR Extensions
* Final-Time-Free and Constrained Input Control
* Dynamic Programming
* Optimal Control for Polynomial Systems
* Output Feedback and Structured Control
* Robustness and Multivariable Frequency-Domain Techniques
Author(s): Frank L. Lewis, Vassilis L. Syrmos
Edition: 2
Publisher: Wiley-Interscience
Year: 1995
Language: English
Pages: 552