Expectation Maximization Algorithm Application in Gaussian Mixture Models
Date: 2022
Author: Shiroko, Priscillah A
Type: Thesis
Language: en
Abstract
Gaussian mixture models are applied in machine learning, specifically unsupervised
machine learning. They can be used in image segmentation
and music classification, to mention just a few applications.
In this project, it is shown how the EM algorithm is derived and how it is
effectively used to soft-cluster data sets into distributions.
The EM algorithm estimates the parameters of a model in a fast and
stable way, fills in missing data in a sample, and finds the values of latent
variables.
The Gaussian mixture model looks at the distributions and groups together only
data points that belong to the same distribution. This is done through soft
clustering, whereby each point is assigned a probability of belonging to each
distribution. It can even accurately cluster data points that lie between
different distributions, by showing to what extent a data point falls in a
particular distribution.
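As a rough illustration of this soft-clustering idea (a sketch, not drawn from the thesis itself), the snippet below computes, for a one-dimensional point, the probability of belonging to each of two Gaussian components; the means, standard deviations, and mixing weights are all hypothetical.

```python
import math

# Hypothetical one-dimensional mixture: two components with assumed
# means, standard deviations, and mixing weights.
means = [0.0, 5.0]
stds = [1.0, 1.0]
weights = [0.5, 0.5]

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def responsibilities(x):
    """Soft clustering: probability that x belongs to each component."""
    likelihoods = [w * gaussian_pdf(x, m, s)
                   for w, m, s in zip(weights, means, stds)]
    total = sum(likelihoods)
    return [lik / total for lik in likelihoods]

# A point midway between the two means is split evenly, while a point
# near one mean is assigned almost entirely to that component.
print(responsibilities(2.5))
print(responsibilities(0.0))
```

Note that the responsibilities for a point always sum to one, which is exactly what distinguishes soft clustering from hard assignment.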
The Expectation Maximization algorithm uses the observed data to obtain optimal
values that can be used to generate the model parameters.
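The alternation between computing responsibilities (E-step) and re-estimating parameters from them (M-step) can be sketched as below; this is a minimal illustration on assumed synthetic data, not the thesis's implementation, and it fits a two-component one-dimensional mixture only.

```python
import math
import random

def em_gmm_1d(data, n_iter=50):
    """Minimal EM for a two-component 1-D Gaussian mixture (a sketch).

    Returns (weights, means, stds) after n_iter EM iterations.
    """
    # Crude initialization from the data range.
    means = [min(data), max(data)]
    stds = [1.0, 1.0]
    weights = [0.5, 0.5]
    n = len(data)
    for _ in range(n_iter):
        # E-step: soft assignments (responsibilities) for every point.
        resp = []
        for x in data:
            lik = [w * math.exp(-0.5 * ((x - m) / s) ** 2)
                   / (s * math.sqrt(2 * math.pi))
                   for w, m, s in zip(weights, means, stds)]
            total = sum(lik)
            resp.append([l / total for l in lik])
        # M-step: update weights, means, stds from the responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            weights[k] = nk / n
            means[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - means[k]) ** 2
                      for r, x in zip(resp, data)) / nk
            stds[k] = math.sqrt(max(var, 1e-6))  # guard against collapse
    return weights, means, stds

# Synthetic data: two well-separated clusters around 0 and 10.
random.seed(0)
data = ([random.gauss(0, 1) for _ in range(200)]
        + [random.gauss(10, 1) for _ in range(200)])
w, m, s = em_gmm_1d(data)
print(sorted(m))  # the fitted means should land near 0 and 10
```

Because the clusters are well separated, EM recovers the generating means closely here; on overlapping clusters the local-optimum behaviour noted below becomes much more visible.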
Limitations anticipated within this study include:
- The Expectation Maximization algorithm has slow convergence, and this convergence is only to a local optimum.
- It also requires both forward and backward probabilities, whereas numerical optimization requires only the forward probability.
Publisher: University of Nairobi
Rights: Attribution-NonCommercial-NoDerivs 3.0 United States
Usage Rights: http://creativecommons.org/licenses/by-nc-nd/3.0/us/