Markov chains are a fundamental class of stochastic processes. They are widely used to solve problems in a large number of domains such as operational research, computer science, communication networks and manufacturing systems. The success of Markov chains is mainly due to their simplicity of use, the large number of available theoretical results and the quality of algorithms developed for the numerical evaluation of many metrics of interest.
The author presents the theory of both discrete-time and continuous-time homogeneous Markov chains. He carefully examines the explosion phenomenon, the Kolmogorov equations, the convergence to equilibrium and the distributions of the passage times to a state and to a subset of states. These results are applied to birth-and-death processes. He then proposes a detailed study of the uniformization technique by means of Banach algebras. This technique is used for the transient analysis of several queueing systems.
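As an illustration of the uniformization technique referred to above (this sketch is not taken from the book): a continuous-time chain with generator Q is replaced by a discrete-time chain observed at the jumps of a Poisson process of rate Λ ≥ max |q_ii|, which turns π(t) = π(0)e^{Qt} into a Poisson-weighted series of powers of the stochastic matrix P = I + Q/Λ. The function name, the two-state rates and the truncation tolerance below are illustrative assumptions.

```python
# Minimal sketch of uniformization for transient state probabilities
# (illustrative only; the generator and rates are made-up example values).
import numpy as np

def transient_probabilities(Q, pi0, t, tol=1e-12):
    """Return pi(t) = pi0 * exp(Q t) via uniformization, truncating the
    Poisson-weighted series once the remaining Poisson mass is below tol."""
    Lam = max(-np.diag(Q))               # uniformization rate >= every exit rate
    P = np.eye(Q.shape[0]) + Q / Lam     # uniformized (stochastic) transition matrix
    weight = np.exp(-Lam * t)            # Poisson weight for n = 0
    acc = weight                         # accumulated Poisson mass
    term = pi0.copy()                    # pi0 * P^n, starting at n = 0
    result = weight * term
    n = 0
    while 1.0 - acc > tol:
        n += 1
        term = term @ P
        weight *= Lam * t / n
        acc += weight
        result += weight * term
    return result

# Example: two-state chain with hypothetical rates a (0 -> 1) and b (1 -> 0).
a, b = 2.0, 3.0
Q = np.array([[-a, a],
              [b, -b]])
pi0 = np.array([1.0, 0.0])
print(transient_probabilities(Q, pi0, t=0.5))
```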
Contents
1. Discrete-Time Markov Chains
2. Continuous-Time Markov Chains
3. Birth-and-Death Processes
4. Uniformization
5. Queues
About the Author
Bruno Sericola is a Senior Research Scientist at Inria Rennes – Bretagne Atlantique in France. His main research activity is in performance evaluation of computer and communication systems, dependability analysis of fault-tolerant systems and stochastic models.
Author(s): Bruno Sericola
Series: ISTE
Edition: 1st
Publisher: Wiley-ISTE
Year: 2013
Language: English
Pages: 416
Tags: Mathematics; Probability theory and mathematical statistics; Theory of stochastic processes;
Cover......Page 1
Title Page......Page 5
Contents......Page 7
Preface......Page 11
1.1. Definitions and properties......Page 15
1.2. Strong Markov property......Page 19
1.3. Recurrent and transient states......Page 22
1.4. State classification......Page 26
1.5. Visits to a state......Page 28
1.6. State space decomposition......Page 32
1.7. Irreducible and recurrent Markov chains......Page 36
1.8. Aperiodic Markov chains......Page 44
1.9. Convergence to equilibrium......Page 48
1.10. Ergodic theorem......Page 55
1.11.1. First passage time to a state......Page 67
1.11.2. First passage time to a subset of states......Page 72
1.11.3. Expected number of visits......Page 78
1.12. Finite Markov chains......Page 82
1.13. Absorbing Markov chains......Page 84
1.14.1. Two-state chain......Page 90
1.14.2. Gambler’s ruin......Page 92
1.14.3. Success runs......Page 96
1.15. Bibliographical notes......Page 101
Chapter 2. Continuous-Time Markov Chains......Page 103
2.1. Definitions and properties......Page 106
2.2. Transition functions and infinitesimal generator......Page 107
2.3. Kolmogorov’s backward equation......Page 122
2.4. Kolmogorov’s forward equation......Page 128
2.5. Existence and uniqueness of the solutions......Page 141
2.6. Recurrent and transient states......Page 144
2.7. State classification......Page 151
2.8. Explosion......Page 155
2.9. Irreducible and recurrent Markov chains......Page 162
2.10. Convergence to equilibrium......Page 176
2.11. Ergodic theorem......Page 180
2.12.1. First passage time to a state......Page 186
2.12.2. First passage time to a subset of states......Page 191
2.13. Absorbing Markov chains......Page 198
2.14. Bibliographical notes......Page 204
3.1. Discrete-time birth-and-death processes......Page 205
3.2. Absorbing discrete-time birth-and-death processes......Page 214
3.2.1. Passage times and convergence to equilibrium......Page 215
3.2.2. Expected number of visits......Page 218
3.3. Periodic discrete-time birth-and-death processes......Page 222
3.4. Continuous-time pure birth processes......Page 223
3.5. Continuous-time birth-and-death processes......Page 227
3.5.1. Explosion......Page 229
3.5.2. Positive recurrence......Page 231
3.5.3. First passage time......Page 234
3.5.4. Explosive chain having an invariant probability......Page 239
3.5.5. Explosive chain without invariant probability......Page 240
3.5.6. Positive or null recurrent embedded chain......Page 241
3.6. Absorbing continuous-time birth-and-death processes......Page 242
3.6.1. Passage times and convergence to equilibrium......Page 243
3.6.2. Explosion......Page 245
3.7. Bibliographical notes......Page 247
4.1. Introduction......Page 249
4.2. Banach spaces and algebra......Page 251
4.3. Infinite matrices and vectors......Page 257
4.4. Poisson process......Page 263
4.4.1. Order statistics......Page 266
4.4.2. Weighted Poisson distribution computation......Page 269
4.4.3. Truncation threshold computation......Page 272
4.5. Uniformizable Markov chains......Page 277
4.6. First passage time to a subset of states......Page 287
4.7. Finite Markov chains......Page 289
4.8.1. State probabilities computation......Page 290
4.8.2. First passage time distribution computation......Page 294
4.8.3. Application to birth-and-death processes......Page 296
4.9. Bibliographical notes......Page 300
Chapter 5. Queues......Page 301
5.1. The M/M/1 queue......Page 302
5.1.1. State probabilities......Page 304
5.1.2. Busy period distribution......Page 325
5.2. The M/M/c queue......Page 329
5.3. The M/M/∞ queue......Page 332
5.4. Phase-type distributions......Page 337
5.5.1. Definition and transient regime......Page 340
5.5.2. Joint distribution of the interarrival times......Page 350
5.5.3. Phase-type renewal processes......Page 355
5.6.1. Definition and transient regime......Page 356
5.6.2. Joint distribution of the interarrival times......Page 363
5.7. Block-structured Markov chains......Page 366
5.7.1. Transient regime of SFL chains......Page 368
5.7.2. Transient regime of SFR chains......Page 377
5.8.1. The M/PH/1 queue......Page 384
5.8.3. The PH/PH/1 queue......Page 386
5.8.4. The PH/PH/c queue......Page 387
5.8.5. The BMAP/PH/1 queue......Page 390
5.8.6. The BMAP/PH/c queue......Page 391
5.9. Bibliographical notes......Page 394
Appendix 1. Basic Results......Page 395
Bibliography......Page 401
Index......Page 409