This 2nd edition is a thoroughly revised and augmented version of the book of the same title published in 1999. The author begins with the elementary theory of Markov chains and progressively brings the reader to more advanced topics. He gives a useful review of probability that makes the book self-contained, and provides an appendix with detailed proofs of all the prerequisites from calculus, algebra, and number theory. A number of carefully chosen problems of varying difficulty are proposed at the close of each chapter, and the mathematics is developed slowly and carefully to make self-study easier. The book treats the classical topics of Markov chain theory, both in discrete time and continuous time, as well as related topics such as finite Gibbs fields, non-homogeneous Markov chains, discrete-time regenerative processes, Monte Carlo simulation, simulated annealing, and queueing theory.
The main additions of the 2nd edition are the exact sampling algorithm of Propp and Wilson, the electrical network analogy for symmetric random walks on graphs, mixing times, and additional details on the branching process. The structure of the book has been modified to smoothly incorporate this new material. Among the features that should improve reader-friendliness, the three main ones are: a shared numbering system for the definitions, theorems, and examples; titles for the examples and exercises; and the blue highlighting of important terms. The result is an up-to-date textbook on stochastic processes.
Students and researchers in operations research and electrical engineering, as well as in physics and biology, will find it very accessible and relevant.
Author(s): Pierre Brémaud
Series: Texts in Applied Mathematics 31
Edition: 2
Publisher: Springer
Year: 2020
Language: English
Pages: 564
Tags: Markov Chain, Recurrence, Ergodicity, Renewal Theory, Martingales, Gibbs Fields, MCMC, Queueing Theory
Preface
From Pushkin to Monte Carlo
A useful, simple, and beautiful theory
The second edition
Course suggestions
Acknowledgements
Contents
Chapter 1 Probability Review
1.1 Basic Concepts
1.1.1 Events, Random Variables and Probability
The Logics of Events
Random Variables
Probability
1.1.2 Independence and Conditional Probability
Bayes’ Rules
Markov Property
1.1.3 Expectation
Cumulative Distribution Function
Expectation, Mean, and Variance
Famous Random Variables
Absolutely Continuous Random Vectors
Discrete Random Vectors
The Product Formula for Expectation
1.1.4 Conditional Expectation
The Rule of Successive Conditioning
1.2 Transforms of Probability Distributions
1.2.1 Generating Functions
1.2.2 Characteristic Functions
1.3 Transformations of Random Vectors
1.3.1 Smooth Change of Variables
1.3.2 Order Statistics
1.4 Almost-Sure Convergence
1.4.1 Two Basic Tools
The Borel–Cantelli Lemma
Markov’s Inequality
1.4.2 The Strong Law of Large Numbers
Proof of Kolmogorov’s Strong Law of Large Numbers
1.5 Exercises
Chapter 2 Discrete-Time Markov Chains
2.1 The Transition Matrix
2.1.1 The Markov Property
The Transition Graph
2.1.2 The Distribution of an HMC
2.2 Markov Recurrences
2.2.1 A Canonical Representation
2.2.2 First-Step Analysis
Absorption Probability
Mean Time to Absorption
2.3 Topology of the Transition Matrix
2.3.1 Communication
2.3.2 Period
2.4 Steady State
2.4.1 Stationarity
2.4.2 Time Reversal
Reversed Chain
Time Reversibility
Random Walk on a Graph
2.5 Regeneration
2.5.1 The Strong Markov Property
2.5.2 Regenerative Cycles
2.6 Exercises
Chapter 3 Recurrence and Ergodicity
3.1 The Potential Matrix Criterion
3.1.1 Recurrent and Transient States
3.1.2 A Criterion of Recurrence
3.1.3 Structure of the Transition Matrix
3.2 Recurrence
3.2.1 Invariant Measures
3.2.2 A Positive Recurrence Criterion
3.3 Empirical Averages
3.3.1 The Ergodic Theorem
3.3.2 The Renewal Reward Theorem
3.4 Exercises
Chapter 4 Long-Run Behavior
4.1 Coupling
4.1.1 Convergence in Variation
4.1.2 The Coupling Method
4.2 Convergence to Steady State
4.2.1 The Positive Recurrent Case
4.2.2 The Null Recurrent Case
4.3 Convergence Rates, a First Look
4.3.1 Convergence Rates via Coupling
4.3.2 The Perron–Frobenius Theorem
4.3.3 Quasi-stationary Distributions
4.3.4 Dobrushin’s Ergodic Coefficient
4.4 Exercises
Chapter 5 Discrete-Time Renewal Theory
5.1 The Renewal Process
5.1.1 The Renewal Equation
5.1.2 The Renewal Theorem
5.1.3 Defective Renewal Sequences
5.2 Regenerative Processes
5.2.1 Renewal Equations for Regenerative Processes
5.2.2 The Regenerative Theorem
5.3 Exercises
Chapter 6 Absorption and Passage Times
6.1 Life Before Absorption
6.1.1 Infinite Sojourns
6.1.2 Time to Absorption
6.2 Absorption Probabilities
6.2.1 The First Fundamental Matrix
6.2.2 The Absorption Matrix
6.2.3 Hitting Times Formula
6.3 The Second Fundamental Matrix
6.3.1 Definition
An Extension of the Fundamental Matrix
6.3.2 The Mutual Time-Distance Matrix
6.3.3 Variance of Ergodic Estimates
6.4 The Branching Process
6.4.1 The Galton–Watson Model
The Standard Description
Probability of Extinction
6.4.2 Tail Distributions
Tail of the Extinction Time Distribution
Tail of the Total Population Distribution
6.4.3 Conditioning by Extinction
6.5 Exercises
Chapter 7 Lyapunov Functions and Martingales
7.1 Lyapunov Functions
7.1.1 Foster’s Theorem
7.1.2 Queueing Applications
7.2 Martingales and Potentials
7.2.1 Harmonic Functions and Martingales
7.2.2 The Maximum Principle
7.3 Martingales and HMCs
7.3.1 The Two Pillars of Martingale Theory
7.3.2 Transience and Recurrence via Martingales
7.3.3 Absorption via Martingales
7.4 Exercises
Chapter 8 Random Walks on Graphs
8.1 Pure Random Walks
8.1.1 The Symmetric Random Walks on Z and Z³
The Symmetric Random Walk on Z³ and Z²
8.1.2 The Pure Random Walk on a Graph
8.1.3 Spanning Trees and Cover Times
8.2 Symmetric Walks on a Graph
8.2.1 Reversible Chains as Symmetric Walks
8.2.2 The Electrical Network Analogy
The Probabilistic Interpretation of Voltage
The Probabilistic Interpretation of Current
8.3 Effective Resistance and Escape Probability
8.3.1 Computation of the Effective Resistance
8.3.2 Thomson’s and Rayleigh’s Principles
8.3.3 Infinite Networks
8.4 Exercises
Chapter 9 Convergence Rates
9.1 Reversible Transition Matrices
9.1.1 A Characterization of Reversibility
9.1.2 Convergence Rates in Terms of the SLEM
9.1.3 Rayleigh’s Spectral Theorem
9.2 Bounds for the SLEM
9.2.1 Bounds via Rayleigh’s Characterization
Weighted Paths
The Bottleneck Bound
9.2.2 Strong Stationary Times
Convergence Rates via Strong Stationary Times
9.2.3 Reversibilization
9.3 Mixing Times
9.3.1 Basic Definitions
9.3.2 Upper Bounds via Coupling
9.3.3 Lower Bounds
9.4 Exercises
Chapter 10 Markov Fields on Graphs
10.1 The Gibbs–Markov Equivalence
10.1.1 Local Characteristics
10.1.2 Gibbs Distributions
The Hammersley–Clifford Theorem
10.2 Specific Models
10.2.1 Random Points
10.2.2 The Auto-binomial Texture Model
10.2.3 The Pixel-Edge Model
10.2.4 Markov Fields in Image Processing
Penalty Methods
10.3 Phase Transition in the Ising Model
10.3.1 Experimental Results
The DLR Problem
10.3.2 The Peierls Argument
10.4 Exercises
Chapter 11 Monte Carlo Markov Chains
11.1 General Principles of Simulation
11.1.1 Simulation via the Law of Large Numbers
11.1.2 Two Methods for Sampling a PDF
Method of the Inverse
The Acceptance-Rejection Method
The Shortcomings
11.2 Monte Carlo Markov Chain Sampling
11.2.1 The Basic Idea
11.2.2 Convergence Rates in MCMC
11.2.3 Variance of Monte Carlo Estimators
11.3 The Gibbs Sampler
11.3.1 Simulation of Random Fields
11.3.2 Convergence Rate of the Gibbs Sampler
11.4 Exact Sampling
11.4.1 The Propp–Wilson Algorithm
11.4.2 Sandwiching
11.5 Exercises
Chapter 12 Non-homogeneous Markov Chains
12.1 Weak and Strong Ergodicity
12.1.1 Ergodicity of Non-Homogeneous Markov Chains
12.1.2 The Block Criterion of Weak Ergodicity
12.1.3 A Sufficient Condition for Strong Ergodicity
Bounded Variation Extensions
12.2 Simulated Annealing
12.2.1 Stochastic Descent and Cooling
12.2.2 Convergence of Simulated Annealing
Fast Cooling
12.3 Exercises
Chapter 13 Continuous-Time Markov Chains
13.1 Poisson Processes
13.1.1 Point Processes
13.1.2 The Counting Process of an HPP
13.1.3 Competing Poisson Processes
13.2 Distribution of a Continuous-Time HMC
13.2.1 The Transition Semi-Group
13.2.2 The Infinitesimal Generator
13.2.3 Kolmogorov’s Differential Systems
13.3 The Regenerative Structure
13.3.1 The Strong Markov Property
13.3.2 Embedded Chain and Transition Times
13.3.3 Explosions
13.4 Recurrence and Long-Run Behavior
13.4.1 Stationary Distribution Criterion of Ergodicity
Time Reversal
13.4.2 Ergodicity
Convergence Rates
13.4.3 Absorption
13.5 Continuous-Time HMCs from HPPs
13.5.1 The Strong Markov Property of HPPs
13.5.2 From Generator to Markov Chain
Paying Two Debts
13.5.3 Poisson Calculus and Continuous-Time HMCs
Watanabe’s Characterization of Poisson Processes
Kolmogorov’s Forward System via Poisson Calculus
Aggregation of States
13.6 Exercises
Chapter 14 Markovian Queueing Theory
14.1 Poisson Systems
14.1.1 The Purely Poissonian Description
Infinitesimal Generator of a Poisson System
14.1.2 Markovian Queues as Poisson Systems
14.2 Isolated Markovian Queues
14.2.1 As Birth-and-Death Processes
14.2.2 The M/GI/1/∞/FIFO Queue
The Embedded Congestion Process
Embedded Sojourn Process
14.2.3 The GI/M/1/∞/FIFO Queue
An Embedded Waiting Time Process
14.3 Markovian Queueing Networks
14.3.1 The Tandem Network
14.3.2 The Jackson Network
14.3.3 The Gordon–Newell Network
14.4 Exercises
Appendix A
A.1 Number Theory and Calculus
A.1.1 Greatest Common Divisor
A.1.2 Abel’s Theorem
A.1.3 Cesàro’s Lemma
A.1.4 Lebesgue’s Theorems for Series
A.1.5 Infinite Products
A.1.6 Tychonov’s Theorem
A.1.7 Subadditive Functions
A.2 Linear Algebra
A.2.1 Eigenvalues and Eigenvectors
A.2.2 Exponential of a Matrix
A.2.3 Gershgorin’s Bound
A.3 Probability and Expectation
A.3.1 Expectation Revisited
A.3.2 Lebesgue’s Theorem for Expectation
Bibliography
Index