Probability and Statistics by Example. Markov Chains: A Primer in Random Processes and Their Applications

Probability and Statistics are as much about intuition and problem solving as they are about theorem proving. Because of this, students can find it very difficult to make a successful transition from lectures to examinations to practice, since the problems involved can vary so widely in nature. The subject is critical in many modern applications, such as mathematical finance, quantitative management, telecommunications, signal processing and bioinformatics, as well as in traditional ones such as insurance, social science and engineering, so the authors address the shortcomings of traditional lecture-based methods by collecting a wealth of exercises with complete solutions, adapted to the needs and skills of students. Following on from the success of Probability and Statistics by Example: Basic Probability and Statistics, the authors here concentrate on random processes, particularly Markov processes, emphasizing models rather than general constructions. Basic mathematical facts are supplied as and when they are needed, and historical information is sprinkled throughout.

Author(s): Yuri Suhov, Mark Kelbert
Edition: 1
Publisher: Cambridge University Press
Year: 2008

Language: English
Pages: 499

Contents
Preface
1 Discrete-time Markov chains
1.1 The Markov property and its immediate consequences
1.2 Class division
1.3 Hitting times and probabilities
1.4 Strong Markov property
1.5 Recurrence and transience: definitions and basic facts
1.6 Recurrence and transience: random walks on lattices
1.7 Equilibrium distributions: definitions and basic facts
1.8 Positive and null recurrence
1.9 Convergence to equilibrium. Long-run proportions
1.10 Detailed balance and reversibility
1.11 Controlled and partially observed Markov chains
1.12 Geometric algebra of Markov chains, I. Eigenvalues and spectral gaps
1.13 Geometric algebra of Markov chains, II. Random walks on graphs
1.14 Geometric algebra of Markov chains, III. The Poincaré and Cheeger bounds
1.15 Large deviations for discrete-time Markov chains
1.16 Examination questions on discrete-time Markov chains
2 Continuous-time Markov chains
2.1 Q-matrices and transition matrices
2.2 Continuous-time Markov chains: definitions and basic constructions
2.3 The Poisson process
2.4 Inhomogeneous Poisson process
2.5 Birth-and-death process. Explosion
2.6 Continuous-time Markov chains with countably many states
2.7 Hitting times and probabilities. Recurrence and transience
2.8 Convergence to an equilibrium distribution. Reversibility
2.9 Applications to queueing theory. Markovian queues
2.10 Examination questions on continuous-time Markov chains
3 Statistics of discrete-time Markov chains
3.1 Introduction
3.2 Likelihood functions, 1. Maximum likelihood estimators
3.3 Consistency of estimators. Various forms of convergence
3.4 Likelihood functions, 2. Whittle's formula
3.5 Bayesian analysis of Markov chains: prior and posterior distributions
3.6 Elements of control and information theory
3.7 Hidden Markov models, 1. State estimation for Markov chains
3.8 Hidden Markov models, 2. The Baum–Welch learning algorithm
3.9 Generalisations of the Baum–Welch algorithm. Global convergence of iterations
Epilogue: Andrei Markov and his Time
Bibliography
Index