Theory and Use of the EM Algorithm


From the Foundations and Trends in Signal Processing series, NOWPress, 2011, 76 pp.
On the expectation–maximization (EM) algorithm for maximum-likelihood estimation: description, training, and applications.
This introduction to the expectation–maximization (EM) algorithm provides an intuitive and mathematically rigorous understanding of EM. Two of the most popular applications of EM are described in detail: estimating Gaussian mixture models (GMMs) and estimating hidden Markov models (HMMs). EM solutions are also derived for learning an optimal mixture of fixed models, for estimating the parameters of a compound Dirichlet distribution, and for disentangling superimposed signals. Practical issues that arise in the use of EM are discussed, as well as variants of the algorithm that help deal with these challenges.
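As a concrete illustration of the GMM application mentioned above, here is a minimal sketch (not taken from the monograph itself) of EM for a two-component one-dimensional Gaussian mixture; the function name em_gmm_1d, the synthetic data, and all parameter choices are illustrative assumptions.

# Minimal sketch: EM for a 1-D Gaussian mixture (illustrative, not from the book).
import numpy as np

def em_gmm_1d(x, n_components=2, n_iters=50, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialize mixture weights, means, and variances.
    w = np.full(n_components, 1.0 / n_components)
    mu = rng.choice(x, n_components, replace=False)
    var = np.full(n_components, np.var(x))
    for _ in range(n_iters):
        # E-step: posterior responsibility of each component for each point.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the expected complete data.
        nk = resp.sum(axis=0)
        w = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
    print(em_gmm_1d(data))

Each iteration alternates an E-step (computing responsibilities under the current parameters) with an M-step (re-estimating weights, means, and variances), which is the pattern the monograph develops in general form.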
Contents:
The Expectation-Maximization Method
  The EM Algorithm
  Contrasting EM with a Simple Variant
  Using a Prior with EM (MAP EM)
  Specifying the Complete Data
  A Toy Example
Analysis of EM
  Convergence
  Maximization–Maximization
Learning Mixtures
  Learning an Optimal Mixture of Fixed Models
  Learning a GMM
  Estimating a Constrained GMM
More EM Examples
  Learning a Hidden Markov Model
  Estimating Multiple Transmitter Locations
  Estimating a Compound Dirichlet Distribution
EM Variants
  EM May Not Find the Global Optimum
  EM May Not Simplify the Computation
  Speed
  When Maximizing the Likelihood Is Not the Goal
Conclusions and Some Historical Notes

Author(s): Gupta M.R., Chen Y.

Language: English
Tags: Instrumentation; Signal processing