Bayesian Scientific Computing


The once esoteric idea of embedding scientific computing into a probabilistic framework, mostly along the lines of the Bayesian paradigm, has recently enjoyed wide popularity and found its way into numerous applications. This book provides an insider's view of how to combine two mature fields, scientific computing and Bayesian inference, into a powerful language that leverages the capabilities of both components for computational efficiency, high resolution, and uncertainty quantification. The impact of Bayesian scientific computing has been particularly significant in the area of computational inverse problems, where the data are often scarce or of low quality, but some characteristics of the unknown solution may be available a priori. The ability to combine the flexibility of the Bayesian probabilistic framework with efficient numerical methods has contributed to the popularity of Bayesian inversion, with the prior distribution being the counterpart of classical regularization. However, the interplay between Bayesian inference and numerical analysis is much richer than providing an alternative way to regularize inverse problems, as demonstrated by the discussion of time-dependent problems, iterative methods, and sparsity-promoting priors in this book. The quantification of uncertainty in computed solutions and model predictions is another area where Bayesian scientific computing plays a critical role. This book demonstrates that Bayesian inference and scientific computing have much more in common than one may expect, and it gradually builds a natural interface between the two areas.
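As a generic illustration of the prior–regularization correspondence mentioned above (the notation here is a standard sketch, not taken from the book's text), consider a linear observation model $b = Ax + e$ with Gaussian noise $e \sim \mathcal{N}(0, \sigma^2 I)$ and a Gaussian prior $x \sim \mathcal{N}(0, (L^T L)^{-1})$. Bayes' formula gives the posterior density

$$\pi(x \mid b) \propto \exp\!\Big(-\frac{1}{2\sigma^2}\|b - Ax\|^2 - \frac{1}{2}\|Lx\|^2\Big),$$

whose maximizer, the maximum a posteriori (MAP) estimate, is

$$x_{\mathrm{MAP}} = \arg\min_x \Big\{ \|b - Ax\|^2 + \sigma^2 \|Lx\|^2 \Big\},$$

that is, exactly the Tikhonov-regularized solution with regularization operator $L$ and regularization parameter $\sigma^2$. The prior thus plays the role of the classical penalty term, while the full posterior additionally quantifies the uncertainty in the estimate.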


Author(s): Daniela Calvetti, Erkki Somersalo
Series: Applied Mathematical Sciences, 215
Publisher: Springer
Year: 2023

Language: English
Pages: 294
City: Cham

ISBN: 978-3-031-23824-6
Preface
Preface to the 2007 Book Introduction to Bayesian Scientific Computing
Contents
1 Bayesian Scientific Computing and Inverse Problems
1.1 What Do We Talk About When We Talk About Random Variables?
1.2 Through the Formal Theory, Lightly
1.2.1 Elementary Probabilities
1.2.2 Probability Distributions and Densities
1.2.3 Expectation and Covariance
1.2.4 Change of Variables in Probability Densities
2 Linear Algebra
2.1 Vectors and Matrices
2.1.1 The Singular Value Decomposition
2.1.2 The Four Fundamental Subspaces
2.2 Solving Linear Systems
2.2.1 What Is a Solution?
2.2.2 Direct Linear System Solvers
3 Continuous and Discrete Multivariate Distributions
3.1 Covariance Matrices
3.2 Normal Distributions
3.3 How Normal is it to be Normal?
3.4 Discrete Distributions
3.4.1 Normal Approximation to the Poisson Distribution
4 Introduction to Sampling
4.1 On Averaging
4.2 Whitening and P–P Plots
4.3 Quadratures and Law of Large Numbers
4.4 Drawing from Discrete Densities
4.5 Sampling from a One-Dimensional Continuous Density
4.6 Sampling from Gaussian Distributions
4.7 Some Useful Sampling Algorithms
4.7.1 Importance Sampling
4.7.2 Drawing from Mixtures: SIR and Weighted Bootstrap
4.8 Rejection Sampling: Prelude to Metropolis–Hastings
5 The Praise of Ignorance: Randomness as Lack of Certainty
5.1 Construction of Likelihood
5.2 Noise Models
5.2.1 Additive Noise
5.2.2 Multiplicative Noise
5.2.3 Poisson Noise
5.2.4 Composite Noise Models
6 Enter Subject: Construction of Priors
6.1 Smoothness Priors
6.1.1 Freeing the Boundary Values
6.2 Generalization to Higher Dimensions
6.3 Whittle–Matérn Priors
6.4 Smoothness Priors with Structure
6.5 Conditionally Gaussian Priors and Hierarchical Models
6.6 Sparsity-Promoting Priors
6.7 Kernel-Based Priors
6.8 Data-Driven Priors
7 Posterior Densities, Ill-Conditioning, and Classical Regularization
7.1 Likelihood Densities and Ill-Posedness of Inverse Problems
7.2 Maximum a Posteriori Estimate and Regularization
8 Conditional Gaussian Densities
8.1 Gaussian Conditional Densities
8.2 Linear Inverse Problems
8.3 Interpolation and Conditional Densities
8.4 Covariance or Precision?
8.5 Some Computed Examples
8.5.1 Bayesian Interpolation of Multi-Dimensional Data
8.5.2 Posterior Density by Low-Rank Updating
9 Iterative Linear Solvers and Priorconditioners
9.1 Iterative Methods in Linear Algebra
9.2 Krylov Subspace Iterative Methods
9.2.1 Conjugate Gradient Algorithm
9.2.2 Conjugate Gradient Method for Least Squares
9.3 Ill-Conditioning and Errors in the Data
9.4 Iterative Solvers in the Bayesian Framework
9.4.1 Preconditioning and Tikhonov Regularization
9.4.2 Priorconditioners: Specially Chosen Preconditioners
9.4.3 Stopping Rule Revisited
10 Hierarchical Models and Bayesian Sparsity
10.1 Posterior Densities with Conditionally Gaussian Priors
10.1.1 IAS, Sparsity, Sensitivity Weighting and Exchangeability
10.1.2 IAS with Priorconditioned CGLS
10.2 More General Sparse Representations
10.3 Some Examples
11 Sampling: The Real Thing
11.1 Preliminaries: Markov Chains and Random Walks
11.1.1 An Introductory Example
11.1.2 Random Walks in ℝⁿ
11.2 Metropolis–Hastings Algorithm
11.2.1 Balance and Detailed Balance Equations
11.2.2 Construction of the MH Transition
11.2.3 Metropolis–Hastings in Action
11.3 Gibbs Sampler
11.4 Preconditioned Crank–Nicolson
12 Dynamic Methods and Learning from the Past
12.1 The Dog and the Hunter
12.2 Sampling Importance Resampling (SIR)
12.2.1 Survival of the Fittest
12.2.2 Estimation of Static Parameters
13 Bayesian Filtering for Gaussian Densities
13.1 Kalman Filtering
13.2 The Best of Two Worlds: Ensemble Kalman Filtering
13.2.1 Adding Unknown Parameters
References
Index