This textbook explains the fundamentals of Markov Chain Monte Carlo (MCMC) without assuming advanced knowledge of mathematics and programming. MCMC is a powerful technique for integrating complicated functions and for sampling from complicated probability distributions. It is frequently used in diverse fields where statistical methods are important, e.g. Bayesian statistics, quantum physics, machine learning, computer science, computational biology, and mathematical economics. This book aims to equip readers with a sound understanding of MCMC and to enable them to write simulation code by themselves.
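For readers who want a concrete picture before diving in, here is a minimal sketch in Python of the kind of naive Monte Carlo computation the book opens with: estimating π with uniform random numbers, as in Sect. 2.2.1. This is an illustration in the spirit of the book, not the authors' published sample code, and the sample count is an arbitrary choice.

```python
import random

def estimate_pi(n_samples: int) -> float:
    """Estimate pi by throwing points uniformly into the unit square
    and counting how many land inside the quarter disk x^2 + y^2 <= 1."""
    inside = 0
    for _ in range(n_samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The quarter disk has area pi/4, so the hit fraction approximates pi/4.
    return 4.0 * inside / n_samples

print(estimate_pi(1_000_000))  # typically prints a value close to 3.14
```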
The content consists of six chapters. Following Chap. 2, which introduces readers to the Monte Carlo method and highlights the advantages of MCMC, Chap. 3 presents the general aspects of MCMC. Chap. 4 illustrates the essence of MCMC through the simple example of the Metropolis algorithm. Chap. 5 then explains the HMC, Gibbs sampling, and Metropolis-Hastings algorithms, discussing their pros, cons, and pitfalls. Lastly, Chap. 6 presents several applications of MCMC. Including a wealth of examples and exercises with solutions, as well as sample code and further math topics in the appendices, this book is a valuable resource for students and beginners in various fields.
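As a small taste of the Metropolis algorithm at the heart of Chap. 4, the following sketch samples a univariate Gaussian distribution, the example treated in Sect. 4.2. It is again only illustrative, not the book's sample code; the step size, chain length, and burn-in length are assumptions chosen for the demonstration.

```python
import math
import random

def metropolis_gaussian(n_samples: int, step_size: float = 0.5) -> list:
    """Sample the standard Gaussian weight exp(-x^2/2) with the Metropolis algorithm."""
    samples = []
    x = 0.0  # arbitrary initial configuration
    for _ in range(n_samples):
        # Symmetric proposal: a uniform random step of width step_size.
        x_new = x + step_size * (2.0 * random.random() - 1.0)
        # Metropolis test with "action" S(x) = x^2/2:
        # accept with probability min(1, exp(S(x) - S(x_new))).
        delta_s = 0.5 * x_new * x_new - 0.5 * x * x
        if delta_s < 0.0 or random.random() < math.exp(-delta_s):
            x = x_new
        samples.append(x)
    return samples

chain = metropolis_gaussian(100_000)
burn_in = 1_000  # discard early samples still correlated with the initial value
tail = chain[burn_in:]
print(sum(tail) / len(tail))  # expectation value of x, close to 0
```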
Authors: Masanori Hanada, So Matsuura
Publisher: Springer
Year: 2022
Language: English
Pages: 197
City: Singapore
Acknowledgements
Contents
1 Introduction
2 What is the Monte Carlo Method?—A Simulation with Random Numbers
2.1 Random Numbers
2.1.1 Uniform Random Numbers
2.1.2 Gaussian Random Numbers (Normal Distribution)
2.1.3 Random Numbers Versus Pseudorandom Numbers
2.2 Integration Utilizing Uniform Random Numbers
2.2.1 Calculation of π with Uniform Random Numbers
2.2.2 Integral with Uniform Random Numbers
2.2.3 The Importance of Importance Sampling
2.3 Expectation Value and Integral
2.4 Calculation of Expectation Value with Gaussian Random Numbers
2.4.1 Box-Muller Method
2.4.2 Expectation Values from the Gaussian Distribution
2.5 Example in Which Randomness is Essential
References
3 General Aspects of Markov Chain Monte Carlo
3.1 Markov Chain
3.2 Irreducibility
3.3 Aperiodicity
3.4 Detailed Balance Condition
3.5 Exercises
References
4 Metropolis Algorithm
4.1 Metropolis Algorithm
4.2 Calculation of Expectation Value
4.3 Autocorrelation
4.3.1 Correlation with the Initial Value and Thermalization (Burn-in)
4.3.2 Autocorrelation
4.3.3 Jackknife Method
4.3.4 Adjustment of the Step Size
4.3.5 Box-Muller Method Revisited
4.4 Examples Other Than the Gaussian Distribution
4.5 Application to Complicated Integrals
4.6 Sign Problem
4.7 Common Mistakes
4.7.1 Changing Step Size in the Middle of the Simulation
4.7.2 Mixing the Configurations Obtained by Using Different Step Sizes
4.7.3 "Random Numbers" Were Not Really Random
4.8 Multivariate Metropolis Algorithm
4.8.1 Multivariate Gaussian Distribution
4.9 Exercises
References
5 Other Useful Algorithms
5.1 The HMC Algorithm
5.1.1 Intuition from Physics
5.1.2 Leapfrog Method
5.1.3 Checking the Conditions for MCMC
5.1.4 Univariate HMC
5.1.5 Multivariate HMC
5.1.6 Tuning the Simulation Parameters
5.1.7 Using Different Step Sizes for Different Variables
5.1.8 Useful Tips for Debugging
5.2 Gibbs Sampling Algorithm (Heat Bath Algorithm)
5.2.1 Bivariate Gaussian Distribution
5.2.2 Multivariate Gibbs Sampling Algorithm
5.2.3 Gibbs Sampling Simulation
5.3 Metropolis-Hastings Algorithm (MH Algorithm)
5.3.1 Advantage over the Metropolis Algorithm
5.3.2 Gibbs Sampling is Metropolis-Hastings
5.3.3 HMC is Metropolis-Hastings
5.3.4 More Elementary Example
5.4 Combination of Different Algorithms
5.5 Exercises
References
6 Applications of Markov Chain Monte Carlo
6.1 Likelihood and Bayesian Statistics
6.1.1 Defining the "Likelihood" Quantitatively
6.1.2 Calculation of Likelihood
6.1.3 Bayesian Statistics
6.1.4 Bayes' Theorem
6.1.5 Bayesian Updating via MCMC
6.2 Ising Model
6.2.1 Ising Model via Metropolis
6.2.2 Ising Model via Gibbs Sampling (Heat Bath)
6.2.3 Simulation of Two-Dimensional Ising Model
6.2.4 Critical Slowing Down and Cluster Algorithm
6.2.5 Wolff Algorithm
6.3 Combinatorial Optimization and Traveling Salesman Problem
6.3.1 Minimization and Local Optimum
6.3.2 Simulated Annealing Algorithm
6.3.3 Replica Exchange Algorithm
6.4 Applications to High-Energy Physics
6.4.1 Quantum Chromodynamics (QCD)
6.4.2 Superstring Theory and Holographic Principle
6.4.3 Further Optimization
References
Appendix A List of Sample Code
A.1 Naive Monte Carlo Method (Not MCMC)
A.1.1 Calculation of π (Sect. 2.2.1)
A.1.2 Calculation of π (Sect. 2.2.2)
A.1.3 Calculation of the Volume of a Ball (x² + y² + z² ≤ 1) (Sect. 2.3)
A.2 Metropolis Algorithm
A.2.1 Univariate Gaussian Integration via Metropolis (Sect. 4.2)
A.2.2 Bivariate Gaussian Integration via Metropolis (Sect. 4.8.1)
A.2.3 Two-Dimensional Ising Model via Metropolis (Sect. 6.2.3)
A.2.4 Bayesian Updating for Coin Toss via Metropolis (Sect. 6.1.5)
A.2.5 Bayesian Updating for Bivariate Gaussian Distribution via Metropolis (Sect. 6.1.5)
A.3 HMC Algorithm
A.3.1 Univariate Gaussian Integration via HMC (Sect. 5.1.4)
A.3.2 Multivariate Gaussian Integration via HMC (Sect. 5.1.5)
A.3.3 Bayesian Updating for Bivariate Gaussian Distribution via HMC (Sect. 6.1.5)
A.3.4 Matrix Integral via HMC (Sect. 5.1.5)
A.4 Gibbs Sampling Algorithm (Heat Bath Algorithm)
A.4.1 Multivariate Gaussian Integration via Gibbs Sampling (Sect. 5.2.3)
A.4.2 Two-Dimensional Ising Model via Gibbs Sampling (Sect. 6.2.3)
A.5 Combination of Different Algorithms
A.5.1 Bayesian Updating for Bivariate Gaussian Distribution via Gibbs Sampling and Metropolis (Sect. 6.1.5)
A.6 Cluster Algorithm
A.6.1 Two-Dimensional Ising Model via the Wolff Algorithm (Sect. 6.2.4)
A.7 Replica Exchange Algorithm
A.7.1 Simple Integral via Replica Exchange (Sect. 6.3.3)
A.7.2 Traveling Salesman Problem via Replica Exchange (Sect. 6.3.3)
Appendix B Miscellaneous Math Topics
B.1 Matrix
B.2 Gaussian Integral
B.2.1 Univariate Gaussian Integration
B.2.2 Multivariate Gaussian Integration
Appendix C Hamilton Equation
Appendix D Jackknife Method in Generic Cases
Appendix E Conjugate Gradient Method (CG Method)
E.1 BiCG Method
E.2 Multi-mass CG Method
References