Building Probabilistic Graphical Models with Python





This is a short, practical guide that helps data scientists understand the concepts of graphical models and try them out using small Python code snippets, without getting too mathematical. If you are a data scientist who knows machine learning and wants to deepen your knowledge of graphical models, such as Bayesian networks, in order to solve real-world problems using Python libraries, this book is for you. It is intended for readers who have some Python and machine learning experience, or who are exploring the machine learning field.
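To give a flavour of the book's "small code snippets" approach, here is a minimal sketch of Bayes' rule (covered in Chapter 1) in plain Python. The disease/test numbers are illustrative assumptions, not taken from the book.

```python
# A toy application of Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E).
# All numbers below are made up for illustration.

def bayes_rule(prior, likelihood, evidence):
    """Return the posterior P(H|E) given P(H), P(E|H), and P(E)."""
    return likelihood * prior / evidence

p_disease = 0.01                # P(H): prior probability of the disease
p_pos_given_disease = 0.95      # P(E|H): test sensitivity
p_pos_given_healthy = 0.05      # P(E|~H): false positive rate

# Total probability of a positive test, P(E), by the law of total probability:
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

posterior = bayes_rule(p_disease, p_pos_given_disease, p_pos)
print(round(posterior, 3))  # → 0.161
```

Even with a 95%-sensitive test, the posterior probability of disease given a positive result is only about 16%, because the prior is so low; this kind of counterintuitive result is exactly what probabilistic inference makes precise.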

Author(s): Kiran R Karkera
Publisher: Packt Publishing
Year: 2014

Language: English
Pages: 172

Cover
Copyright
Credits
About the Author
About the Reviewers
www.PacktPub.com
Table of Contents
Preface
Chapter 1: Probability
The theory of probability
Goals of probabilistic inference
Conditional probability
The chain rule
The Bayes rule
Interpretations of probability
Random variables
Marginal distribution
Joint distribution
Independence
Conditional independence
Types of queries
Probability queries
MAP queries
Summary
Chapter 2: Directed Graphical Models
Graph terminology
Python digression
Independence and independent parameters
The Bayes network
The chain rule
Reasoning patterns
Causal reasoning
Evidential reasoning
Inter-causal reasoning
D-separation
The D-separation example
Blocking and unblocking a V-structure
Factorization and I-maps
The Naive Bayes model
The Naive Bayes example
Summary
Chapter 3: Undirected Graphical Models
Pairwise Markov networks
The Gibbs distribution
An induced Markov network
Factorization
Flow of influence
Active trail and separation
Structured prediction
Problem of correlated features
The CRF representation
The CRF example
The factorization-independence tango
Summary
Chapter 4: Structure Learning
The structure learning landscape
Constraint-based structure learning
Part I
Part II
Part III
Summary of constraint-based approaches
Score-based learning
The likelihood score
The Bayesian information criterion score
The Bayesian score
Summary of score-based learning
Summary
Chapter 5: Parameter Learning
The likelihood function
Parameter learning example using MLE
MLE for Bayesian networks
Bayesian parameter learning example using MLE
Data fragmentation
Effect of data fragmentation on parameter estimation
Bayesian parameter estimation
An example of Bayesian methods for parameter learning
Bayesian estimation for the Bayesian network
Example of Bayesian estimation
Summary
Chapter 6: Exact Inference Using Graphical Models
Complexity of inference
Real-world issues
Using the Variable Elimination algorithm
Marginalizing factors that are not relevant
Factor reduction to filter on evidence
Shortcomings of the brute-force approach
Using the Variable Elimination approach
Complexity of Variable Elimination
Graph perspective
Learning the induced width from the graph structure
The tree algorithm
The four stages of the junction tree algorithm
Using the junction tree algorithm for inference
Stage 1.1 – moralization
Stage 1.2 – triangulation
Stage 1.3 – building the join tree
Stage 2 – initializing potentials
Stage 3 – message passing
Summary
Chapter 7: Approximate Inference Methods
The optimization perspective
Belief propagation on general graphs
Creating a cluster graph to run LBP
Message passing in LBP
Steps in the LBP algorithm
Improving the convergence of LBP
Applying LBP to segment an image
Understanding energy-based models
Visualizing unary and pairwise factors on a 3 x 3 grid
Creating the model for image segmentation
Applications of LBP
Sampling-based methods
Forward sampling
The accept-reject sampling method
The Markov Chain Monte Carlo sampling process
The Markov property
The Markov chain
Reaching a steady state
Sampling using a Markov chain
Gibbs sampling
Steps in the Gibbs sampling procedure
An example of Gibbs sampling
Summary
Appendix: References
Index