Probabilistic Graphical Models: Principles and Applications

This fully updated new edition of a uniquely accessible textbook/reference provides a general introduction to probabilistic graphical models (PGMs) from an engineering perspective. It features new material on partially observable Markov decision processes, causal graphical models, and deep learning, as well as an even greater number of exercises. The book covers the fundamentals of each of the main classes of PGMs, including representation, inference, and learning principles, and reviews real-world applications for each type of model. These applications are drawn from a broad range of disciplines, highlighting the many uses of Bayesian classifiers, hidden Markov models, Bayesian networks, dynamic and temporal Bayesian networks, Markov random fields, influence diagrams, and Markov decision processes.

Topics and features:
- Presents a unified framework encompassing all of the main classes of PGMs
- Explores the fundamental aspects of representation, inference, and learning for each technique
- Examines new material on partially observable Markov decision processes and causal graphical models
- Includes a new chapter introducing deep neural networks and their relation to probabilistic graphical models
- Covers multidimensional Bayesian classifiers, relational graphical models, and causal models
- Provides substantial chapter-ending exercises, suggestions for further reading, and ideas for research or programming projects
- Describes classifiers such as Gaussian naive Bayes, circular chain classifiers, and hierarchical classifiers with Bayesian networks
- Outlines the practical application of the different techniques
- Suggests possible course outlines for instructors

This classroom-tested work is suitable as a textbook for an advanced undergraduate or graduate course in probabilistic graphical models for students of computer science, engineering, and physics. Professionals wishing to apply probabilistic graphical models in their own field, or interested in the basis of these techniques, will also find the book to be an invaluable reference.
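
As a flavor of the material in Chapter 4 (Bayesian Classifiers), the following is a minimal naive Bayes sketch in plain Python. It is not taken from the book or from the Python library described in Appendix A; the NaiveBayes class, its fit/predict methods, and the toy weather data are illustrative assumptions only.

```python
# Minimal naive Bayes classifier for discrete features (illustrative only;
# not code from the book or its accompanying Python library).
from collections import defaultdict
import math


class NaiveBayes:
    def fit(self, X, y):
        # Count class frequencies and, per (class, feature index), the
        # frequency of each observed feature value.
        self.class_counts = defaultdict(int)
        self.value_counts = defaultdict(lambda: defaultdict(int))
        for xs, c in zip(X, y):
            self.class_counts[c] += 1
            for i, v in enumerate(xs):
                self.value_counts[(c, i)][v] += 1
        self.n = len(y)
        return self

    def predict(self, xs):
        # Choose the class maximizing log P(C) + sum_i log P(x_i | C),
        # with add-one smoothing over the values seen for each feature.
        best_class, best_score = None, float("-inf")
        for c, count_c in self.class_counts.items():
            score = math.log(count_c / self.n)
            for i, v in enumerate(xs):
                counts = self.value_counts[(c, i)]
                score += math.log((counts.get(v, 0) + 1) /
                                  (count_c + len(counts) + 1))
            if score > best_score:
                best_class, best_score = c, score
        return best_class


# Toy usage with two discrete attributes (outlook, temperature).
X = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "hot")]
y = ["no", "yes", "yes", "no"]
print(NaiveBayes().fit(X, y).predict(("sunny", "mild")))  # -> "yes"
```

The sketch works in log-probabilities to avoid numerical underflow and applies simple add-one smoothing so that a feature value unseen for a class does not force a zero posterior.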

Author(s): Luis Enrique Sucar
Series: Advances in Computer Vision and Pattern Recognition
Publisher: Springer
Year: 2021

Language: English
Pages: 355
City: Cham

Foreword
Preface
Acknowledgements
Contents
Acronyms
Notation
Part I Fundamentals
1 Introduction
1.1 Uncertainty
1.1.1 Effects of Uncertainty
1.2 A Brief History
1.3 Basic Probabilistic Models
1.3.1 An Example
1.4 Probabilistic Graphical Models
1.5 Representation, Inference and Learning
1.6 Applications
1.7 Overview of the Book
1.8 Additional Reading
References
2 Probability Theory
2.1 Introduction
2.2 Basic Rules
2.3 Random Variables
2.3.1 Two-Dimensional Random Variables
2.4 Information Theory
2.5 Additional Reading
2.6 Exercises
References
3 Graph Theory
3.1 Definitions
3.2 Types of Graphs
3.3 Trajectories and Circuits
3.4 Graph Isomorphism
3.5 Trees
3.6 Cliques
3.7 Perfect Ordering
3.8 Ordering and Triangulation Algorithms
3.8.1 Maximum Cardinality Search
3.8.2 Graph Filling
3.9 Additional Reading
3.10 Exercises
References
Part II Probabilistic Models
4 Bayesian Classifiers
4.1 Introduction
4.1.1 Classifier Evaluation
4.2 Bayesian Classifier
4.2.1 Naive Bayesian Classifier
4.3 Gaussian Naive Bayes
4.4 Alternative Models: TAN, BAN
4.5 Semi-naive Bayesian Classifiers
4.6 Multidimensional Bayesian Classifiers
4.6.1 Multidimensional Bayesian Network Classifiers
4.6.2 Chain Classifiers
4.7 Hierarchical Classification
4.7.1 Chained Path Evaluation
4.7.2 Hierarchical Classification with Bayesian Networks
4.8 Applications
4.8.1 Visual Skin Detection
4.8.2 HIV Drug Selection
4.9 Additional Reading
4.10 Exercises
References
5 Hidden Markov Models
5.1 Introduction
5.2 Markov Chains
5.2.1 Parameter Estimation
5.2.2 Convergence
5.3 Hidden Markov Models
5.3.1 Evaluation
5.3.2 State Estimation
5.3.3 Learning
5.3.4 Gaussian Hidden Markov Models
5.3.5 Extensions
5.4 Applications
5.4.1 PageRank
5.4.2 Gesture Recognition
5.5 Additional Reading
5.6 Exercises
References
6 Markov Random Fields
6.1 Introduction
6.2 Markov Random Fields
6.2.1 Regular Markov Random Fields
6.3 Gibbs Random Fields
6.4 Inference
6.5 Parameter Estimation
6.5.1 Parameter Estimation with Labeled Data
6.6 Conditional Random Fields
6.7 Applications
6.7.1 Image Smoothing
6.7.2 Improving Image Annotation
6.8 Additional Reading
6.9 Exercises
References
7 Bayesian Networks: Representation and Inference
7.1 Introduction
7.2 Representation
7.2.1 Structure
7.2.2 Parameters
7.3 Inference
7.3.1 Singly Connected Networks: Belief Propagation
7.3.2 Multiple Connected Networks
7.3.3 Approximate Inference
7.3.4 Most Probable Explanation
7.3.5 Continuous Variables
7.4 Applications
7.4.1 Information Validation
7.4.2 Reliability Analysis
7.5 Additional Reading
7.6 Exercises
References
8 Bayesian Networks: Learning
8.1 Introduction
8.2 Parameter Learning
8.2.1 Smoothing
8.2.2 Parameter Uncertainty
8.2.3 Missing Data
8.2.4 Discretization
8.3 Structure Learning
8.3.1 Tree Learning
8.3.2 Learning a Polytree
8.3.3 Search and Score Techniques
8.3.4 Independence Tests Techniques
8.4 Combining Expert Knowledge and Data
8.5 Transfer Learning
8.6 Applications
8.6.1 Air Pollution Model for Mexico City
8.6.2 Agricultural Planning Using Bayesian Networks
8.7 Additional Reading
8.8 Exercises
References
9 Dynamic and Temporal Bayesian Networks
9.1 Introduction
9.2 Dynamic Bayesian Networks
9.2.1 Inference
9.2.2 Sampling
9.2.3 Learning
9.2.4 Dynamic Bayesian Network Classifiers
9.3 Temporal Event Networks
9.3.1 Temporal Nodes Bayesian Networks
9.4 Applications
9.4.1 DBN: Gesture Recognition
9.4.2 TNBN: Predicting HIV Mutational Pathways
9.5 Additional Reading
9.6 Exercises
References
Part III Decision Models
10 Decision Graphs
10.1 Introduction
10.2 Decision Theory
10.2.1 Fundamentals
10.3 Decision Trees
10.4 Influence Diagrams
10.4.1 Modeling
10.4.2 Evaluation
10.4.3 Extensions
10.5 Applications
10.5.1 Decision Support System for Lung Cancer
10.5.2 Decision-Theoretic Caregiver
10.6 Additional Reading
10.7 Exercises
References
11 Markov Decision Processes
11.1 Introduction
11.2 Modeling
11.3 Evaluation
11.3.1 Value Iteration
11.3.2 Policy Iteration
11.3.3 Complexity Analysis
11.4 Factored MDPs
11.4.1 Abstraction
11.4.2 Decomposition
11.5 Applications
11.5.1 Power Plant Operation
11.5.2 Robot Task Coordination
11.6 Additional Reading
11.7 Exercises
References
12 Partially Observable Markov Decision Processes
12.1 Introduction
12.2 Representation
12.3 Solution Techniques
12.3.1 Value Functions
12.3.2 Solution Algorithms
12.4 Applications
12.4.1 Automatic Adaptation in Virtual Rehabilitation
12.4.2 Hierarchical POMDPs for Task Planning in Robotics
12.5 Additional Reading
12.6 Exercises
References
Part IV Relational, Causal and Deep Models
13 Relational Probabilistic Graphical Models
13.1 Introduction
13.2 Logic
13.2.1 Propositional Logic
13.2.2 First-Order Predicate Logic
13.3 Probabilistic Relational Models
13.3.1 Inference
13.3.2 Learning
13.4 Markov Logic Networks
13.4.1 Inference
13.4.2 Learning
13.5 Applications
13.5.1 Student Modeling
13.5.2 Visual Grammars
13.6 Additional Reading
13.7 Exercises
References
14 Graphical Causal Models
14.1 Introduction
14.1.1 Definition of Causality
14.2 Causal Bayesian Networks
14.2.1 Gaussian Linear Models
14.3 Causal Reasoning
14.3.1 Prediction
14.3.2 Counterfactuals
14.4 Front Door and Back Door Criterion
14.4.1 Back Door Criterion
14.4.2 Front Door Criterion
14.5 Applications
14.5.1 Characterizing Patterns of Unfairness
14.5.2 Accelerating Reinforcement Learning with Causal Models
14.6 Additional Reading
14.7 Exercises
References
15 Causal Discovery
15.1 Introduction
15.2 Types of Graphs
15.2.1 Markov Equivalence Classes Under Causal Sufficiency
15.2.2 Markov Equivalence Classes with Unmeasured Variables
15.3 Causal Discovery Algorithms
15.3.1 Score-Based Causal Discovery
15.3.2 Constraint-Based Causal Discovery
15.3.3 Causal Discovery with Linear Models
15.4 Applications
15.4.1 Learning a Causal Model for ADHD
15.4.2 Decoding Brain Effective Connectivity Based on fNIRS
15.5 Additional Reading
15.6 Exercises
References
16 Deep Learning and Graphical Models
16.1 Introduction
16.2 Review of Neural Networks and Deep Learning
16.2.1 A Brief History
16.2.2 Deep Neural Networks
16.3 Graphical Models and Neural Networks
16.3.1 Naive Bayes Classifiers Versus Perceptrons
16.3.2 Bayesian Networks Versus Multi-layer Neural Networks
16.4 Hybrid Models
16.4.1 Testing Bayesian Networks
16.4.2 Integrating Graphical and Deep Models
16.5 Applications
16.5.1 Human Body Pose Tracking
16.5.2 Neural Enhanced Belief Propagation for Error Correction
16.6 Additional Reading
16.7 Exercises
References
Appendix A A Python Library for Inference and Learning
A.1 Introduction
A.2 Requirements
A.3 Installation
Appendix Glossary
Index