From the Foundations and Trends in Signal Processing series, NOWPress, 2011, 76 pp.
An overview of the expectation–maximization (EM) algorithm for maximum-likelihood estimation: description, training, and applications.
This introduction to the expectation–maximization (EM) algorithm provides an intuitive and mathematically rigorous understanding of EM. Two of the most popular applications of EM are described in detail: estimating Gaussian mixture models (GMMs) and estimating hidden Markov models (HMMs). EM solutions are also derived for learning an optimal mixture of fixed models, for estimating the parameters of a compound Dirichlet distribution, and for disentangling superimposed signals. Practical issues that arise in the use of EM are discussed, as well as variants of the algorithm that help deal with these challenges.
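To give a flavor of the first application named above, the following is a minimal sketch of EM for a one-dimensional GMM. It is an illustration under standard assumptions, not code from the monograph; the function and variable names (em_gmm_1d, weights, means, variances) are hypothetical.

```python
import numpy as np

def em_gmm_1d(x, k, n_iter=100, seed=0):
    """Fit a k-component 1-D GMM to data x by expectation-maximization."""
    rng = np.random.default_rng(seed)
    n = x.size
    # Initialize mixture weights, means, and variances.
    weights = np.full(k, 1.0 / k)
    means = rng.choice(x, size=k, replace=False)
    variances = np.full(k, x.var())
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        dens = np.exp(-0.5 * (x[:, None] - means) ** 2 / variances) \
               / np.sqrt(2 * np.pi * variances)          # shape (n, k)
        resp = weights * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities.
        nk = resp.sum(axis=0)
        weights = nk / n
        means = (resp * x[:, None]).sum(axis=0) / nk
        variances = (resp * (x[:, None] - means) ** 2).sum(axis=0) / nk
        variances = np.maximum(variances, 1e-6)  # guard against collapse
    return weights, means, variances

# Example: recover two well-separated components.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
print(em_gmm_1d(data, k=2))
```

Each iteration computes posterior responsibilities (E-step) and then re-estimates the weights, means, and variances in closed form (M-step); the monograph derives these updates and analyzes their convergence.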
The Expectation-Maximization Method
    The EM Algorithm
    Contrasting EM with a Simple Variant
    Using a Prior with EM (MAP EM)
    Specifying the Complete Data
    A Toy Example
Analysis of EM
    Convergence
    Maximization–Maximization
Learning Mixtures
    Learning an Optimal Mixture of Fixed Models
    Learning a GMM
    Estimating a Constrained GMM
More EM Examples
    Learning a Hidden Markov Model
    Estimating Multiple Transmitter Locations
    Estimating a Compound Dirichlet Distribution
EM Variants
    EM May Not Find the Global Optimum
    EM May Not Simplify the Computation
    Speed
    When Maximizing the Likelihood Is Not the Goal
Conclusions and Some Historical Notes