Control Theory in Physics and other Fields of Science: Concepts, Tools, and Applications

This book provides an introduction to the analysis and control of physical, chemical, biological, technological, and economic models and their nonequilibrium evolution dynamics. Strong emphasis is placed on the foundations of variational principles, evolution and control equations, numerical methods, and statistical concepts and techniques for solving or estimating stochastic control problems in systems with a high degree of complexity. In particular, the central aim of this book is to develop a synergetic connection between theoretical concepts and real applications. The book is a modern introduction and a helpful tool for researchers as well as graduate students interested in econophysics and related topics.

Author(s): Michael Schulz
Series: Springer Tracts in Modern Physics
Edition: 1
Publisher: Springer
Year: 2005

Language: English
Pages: 314
City: Berlin

Table of Contents

Front Matter
1.1 The Aim of Control Theory
1.2 Dynamic State of Classical Mechanical Systems
1.3 Dynamic State of Complex Systems
1.4 The Physical Approach to Control Theory
References
2.1 Introduction: The Brachistochrone Problem
2.2 The Deterministic Control Problem
2.3 The Simplest Control Problem: Classical Mechanics
2.4 General Optimum Control Problem
2.5 The Hamilton–Jacobi Equation
References
3.1 Introduction to Linear Quadratic Problems
3.2 Extensions and Applications
3.3 The Optimal Regulator
3.4 Control of Linear Oscillations and Relaxations
References
4.1 Field Equations
4.2 Control by External Sources
4.3 Control via Boundary Conditions
References
5.1 Characterization of Trajectories in the Phase Space
5.2 Time-Discrete Chaos Control
5.3 Time-Continuous Chaos Control
References
6.1 Statistical Approach to Phase Space Dynamics
6.2 The Liouville Equation
6.3 Generalized Rate Equations
6.4 Notation of Probability Theory
6.5 Combined Probabilities
6.6 Markov Approximation
6.7 Generalized Fokker–Planck Equation
6.8 Correlation and Stationarity
6.9 Stochastic Equations of Motions
References
7.1 Markov Diffusion Processes under Control
7.2 Optimal Open Loop Control
7.3 Feedback Control
References
8.1 Partial Uncertainty of Controlled Systems
8.2 Gaussian Processes
8.3 Lévy Processes
8.4 Rare Events
8.5 Kalman Filter
8.6 Filters and Predictors
References
9.1 Unpredictable Systems
9.2 Optimal Control and Decision Theory
9.3 Zero-Sum Games
9.4 Nonzero-Sum Games
References
10.1 Notations of Optimization Theory
10.2 Optimization Methods
References
Back Matter