This book is an introduction to the modern theory of Markov chains, whose goal is to determine the rate of convergence to the stationary distribution, as a function of state space size and geometry. This topic has important connections to combinatorics, statistical physics, and theoretical computer science. Many of the techniques presented originate in these disciplines.
The central tools for estimating convergence times, including coupling, strong stationary times, and spectral methods, are developed. The authors discuss many examples, including card shuffling and the Ising model from statistical physics, and present the connection between random walks and electrical networks, applying it to estimate hitting and cover times.
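To make the central quantity concrete, here is a small Python sketch (not from the book) that measures convergence to stationarity for one simple chain: the lazy random walk on an n-cycle, whose stationary distribution is uniform. The choice of chain, the value n = 12, and the threshold 1/4 (the usual mixing-time convention) are illustrative assumptions, not prescriptions from the text.

```python
import numpy as np

def lazy_cycle_kernel(n):
    """Transition matrix of the lazy random walk on the n-cycle:
    stay put with probability 1/2, otherwise step to a uniform neighbor."""
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 0.5
        P[i, (i - 1) % n] += 0.25
        P[i, (i + 1) % n] += 0.25
    return P

def total_variation(mu, nu):
    """Total variation distance between two distributions on the same finite set."""
    return 0.5 * np.abs(mu - nu).sum()

n = 12                             # illustrative state space size
P = lazy_cycle_kernel(n)
pi = np.full(n, 1.0 / n)           # uniform stationary distribution
mu = np.zeros(n); mu[0] = 1.0      # start from a fixed state (worst case, by symmetry)

t = 0
while total_variation(mu, pi) > 0.25:   # stop at t_mix(1/4), the standard convention
    mu = mu @ P                         # one step of the chain: mu_{t+1} = mu_t P
    t += 1
print(f"lazy walk on the {n}-cycle: total variation distance first drops below 1/4 at t = {t}")
```

Repeating this for several values of n shows the dependence of the mixing time on the size and geometry of the state space that the book studies systematically.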
The first edition has been used in courses in mathematics and computer science departments of numerous universities. The second edition features three new chapters (on monotone chains, the exclusion process, and stationary times) and also includes smaller additions and corrections throughout. Updated notes at the end of each chapter inform the reader of recent research developments.
Author(s): David A. Levin, Yuval Peres, Elizabeth L. Wilmer, James G. Propp, David B. Wilson
Series: MBK/107
Edition: Second
Publisher: American Mathematical Society
Year: 2017
Language: English
Pages: 463
Tags: Stochastic
Title page
Contents
Preface
Preface to the Second Edition
Preface to the First Edition
Overview
For the Reader
For the Instructor
For the Expert
Acknowledgements
Part I: Basic Methods and Examples
Chapter 1. Introduction to Finite Markov Chains
1.1. Markov Chains
1.2. Random Mapping Representation
1.3. Irreducibility and Aperiodicity
1.4. Random Walks on Graphs
1.5. Stationary Distributions
1.6. Reversibility and Time Reversals
1.7. Classifying the States of a Markov Chain*
Exercises
Notes
Chapter 2. Classical (and Useful) Markov Chains
2.1. Gambler’s Ruin
2.2. Coupon Collecting
2.3. The Hypercube and the Ehrenfest Urn Model
2.4. The Pólya Urn Model
2.5. Birth-and-Death Chains
2.6. Random Walks on Groups
2.7. Random Walks on ℤ and Reflection Principles
Exercises
Notes
Chapter 3. Markov Chain Monte Carlo: Metropolis and Glauber Chains
3.1. Introduction
3.2. Metropolis Chains
3.3. Glauber Dynamics
Exercises
Notes
Chapter 4. Introduction to Markov Chain Mixing
4.1. Total Variation Distance
4.2. Coupling and Total Variation Distance
4.3. The Convergence Theorem
4.4. Standardizing Distance from Stationarity
4.5. Mixing Time
4.6. Mixing and Time Reversal
4.7. ℓ^p Distance and Mixing
Exercises
Notes
Chapter 5. Coupling
5.1. Definition
5.2. Bounding Total Variation Distance
5.3. Examples
5.4. Grand Couplings
Exercises
Notes
Chapter 6. Strong Stationary Times
6.1. Top-to-Random Shuffle
6.2. Markov Chains with Filtrations
6.3. Stationary Times
6.4. Strong Stationary Times and Bounding Distance
6.5. Examples
6.6. Stationary Times and Cesàro Mixing Time
6.7. Optimal Strong Stationary Times*
Exercises
Notes
Chapter 7. Lower Bounds on Mixing Times
7.1. Counting and Diameter Bounds
7.2. Bottleneck Ratio
7.3. Distinguishing Statistics
7.4. Examples
Exercises
Notes
Chapter 8. The Symmetric Group and Shuffling Cards
8.1. The Symmetric Group
8.2. Random Transpositions
8.3. Riffle Shuffles
Exercises
Notes
Chapter 9. Random Walks on Networks
9.1. Networks and Reversible Markov Chains
9.2. Harmonic Functions
9.3. Voltages and Current Flows
9.4. Effective Resistance
9.5. Escape Probabilities on a Square
Exercises
Notes
Chapter 10. Hitting Times
10.1. Definition
10.2. Random Target Times
10.3. Commute Time
10.4. Hitting Times on Trees
10.5. Hitting Times for Eulerian Graphs
10.6. Hitting Times for the Torus
10.7. Bounding Mixing Times via Hitting Times
10.8. Mixing for the Walk on Two Glued Graphs
Exercises
Notes
Chapter 11. Cover Times
11.1. Definitions
11.2. The Matthews Method
11.3. Applications of the Matthews Method
11.4. Spanning Tree Bound for Cover Time
11.5. Waiting for All Patterns in Coin Tossing
Exercises
Notes
Chapter 12. Eigenvalues
12.1. The Spectral Representation of a Reversible Transition Matrix
12.2. The Relaxation Time
12.3. Eigenvalues and Eigenfunctions of Some Simple Random Walks
12.4. Product Chains
12.5. Spectral Formula for the Target Time
12.6. An ℓ² Bound
12.7. Time Averages
Exercises
Notes
Part II: The Plot Thickens
Chapter 13. Eigenfunctions and Comparison of Chains
13.1. Bounds on Spectral Gap via Contractions
13.2. The Dirichlet Form and the Bottleneck Ratio
13.3. Simple Comparison of Markov Chains
13.4. The Path Method
13.5. Wilson’s Method for Lower Bounds
13.6. Expander Graphs*
Exercises
Notes
Chapter 14. The Transportation Metric and Path Coupling
14.1. The Transportation Metric
14.2. Path Coupling
14.3. Rapid Mixing for Colorings
14.4. Approximate Counting
Exercises
Notes
Chapter 15. The Ising Model
15.1. Fast Mixing at High Temperature
15.2. The Complete Graph
15.3. The Cycle
15.4. The Tree
15.5. Block Dynamics
15.6. Lower Bound for the Ising Model on the Square*
Exercises
Notes
Chapter 16. From Shuffling Cards to Shuffling Genes
16.1. Random Adjacent Transpositions
16.2. Shuffling Genes
Exercises
Notes
Chapter 17. Martingales and Evolving Sets
17.1. Definition and Examples
17.2. Optional Stopping Theorem
17.3. Applications
17.4. Evolving Sets
17.5. A General Bound on Return Probabilities
17.6. Harmonic Functions and the Doob h-Transform
17.7. Strong Stationary Times from Evolving Sets
Exercises
Notes
Chapter 18. The Cutoff Phenomenon
18.1. Definition
18.2. Examples of Cutoff
18.3. A Necessary Condition for Cutoff
18.4. Separation Cutoff
Exercises
Notes
Chapter 19. Lamplighter Walks
19.1. Introduction
19.2. Relaxation Time Bounds
19.3. Mixing Time Bounds
19.4. Examples
Exercises
Notes
Chapter 20. Continuous-Time Chains*
20.1. Definitions
20.2. Continuous-Time Mixing
20.3. Spectral Gap
20.4. Product Chains
Exercises
Notes
Chapter 21. Countable State Space Chains*
21.1. Recurrence and Transience
21.2. Infinite Networks
21.3. Positive Recurrence and Convergence
21.4. Null Recurrence and Convergence
21.5. Bounds on Return Probabilities
Exercises
Notes
Chapter 22. Monotone Chains
22.1. Introduction
22.2. Stochastic Domination
22.3. Definition and Examples of Monotone Markov Chains
22.4. Positive Correlations
22.5. The Second Eigenfunction
22.6. Censoring Inequality
22.7. Lower Bound on d̄
22.8. Proof of Strassen’s Theorem
Exercises
Notes
Chapter 23. The Exclusion Process
23.1. Introduction
23.2. Mixing Time of k-Exclusion on the n-Path
23.3. Biased Exclusion
Exercises
Notes
Chapter 24. Cesàro Mixing Time, Stationary Times, and Hitting Large Sets
24.1. Introduction
24.2. Equivalence of t_stop, t_Ces, and t_G for Reversible Chains
24.3. Halting States and Mean-Optimal Stopping Times
24.4. Regularity Properties of Geometric Mixing Times*
24.5. Equivalence of t_G and t_H
24.6. Upward Skip-Free Chains
24.7. t_H(α) Are Comparable for α ≤ 1/2
24.8. An Upper Bound on t_rel
24.9. Application to Robustness of Mixing
Exercises
Notes
Chapter 25. Coupling from the Past
25.1. Introduction
25.2. Monotone CFTP
25.3. Perfect Sampling via Coupling from the Past
25.4. The Hardcore Model
25.5. Random State of an Unknown Markov Chain
Exercise
Notes
Chapter 26. Open Problems
26.1. The Ising Model
26.2. Cutoff
26.3. Other Problems
26.4. Update: Previously Open Problems
Appendix A. Background Material
A.1. Probability Spaces and Random Variables
A.2. Conditional Expectation
A.3. Strong Markov Property
A.4. Metric Spaces
A.5. Linear Algebra
A.6. Miscellaneous
Exercise
Appendix B. Introduction to Simulation
B.1. What Is Simulation?
B.2. Von Neumann Unbiasing*
B.3. Simulating Discrete Distributions and Sampling
B.4. Inverse Distribution Function Method
B.5. Acceptance-Rejection Sampling
B.6. Simulating Normal Random Variables
B.7. Sampling from the Simplex
B.8. About Random Numbers
B.9. Sampling from Large Sets*
Exercises
Notes
Appendix C. Ergodic Theorem
C.1. Ergodic Theorem*
Exercise
Appendix D. Solutions to Selected Exercises
Bibliography
Notation Index
Index