A new edition of the classic text on optimal control theory
A superb introductory text and an indispensable reference, this new edition of Optimal Control serves the needs of both the professional engineer and the advanced student in mechanical, electrical, and aerospace engineering. Its coverage encompasses all the fundamental topics as well as the major developments of recent years. Abundant computer simulations using MATLAB and its relevant toolboxes give the reader hands-on experience in applying the theory to real-world problems. Major topics covered include:
Static Optimization
Optimal Control of Discrete-Time Systems
Optimal Control of Continuous-Time Systems
The Tracking Problem and Other LQR Extensions
Final-Time-Free and Constrained Input Control
Dynamic Programming
Optimal Control for Polynomial Systems
Output Feedback and Structured Control
Robustness and Multivariable Frequency-Domain Techniques
Differential Games
Reinforcement Learning and Optimal Adaptive Control
Chapter 1 Static Optimization (pages 1–18)
Chapter 2 Optimal Control of Discrete-Time Systems (pages 19–109)
Chapter 3 Optimal Control of Continuous-Time Systems (pages 110–176)
Chapter 4 The Tracking Problem and Other LQR Extensions (pages 177–212)
Chapter 5 Final-Time-Free and Constrained Input Control (pages 213–259)
Chapter 6 Dynamic Programming (pages 260–286)
Chapter 7 Optimal Control for Polynomial Systems (pages 287–296)
Chapter 8 Output Feedback and Structured Control (pages 297–354)
Chapter 9 Robustness and Multivariable Frequency-Domain Techniques (pages 355–437)
Chapter 10 Differential Games (pages 438–460)
Chapter 11 Reinforcement Learning and Optimal Adaptive Control (pages 461–517)