Expectation Maximization Algorithm Application in Gaussian Mixture Models
dc.contributor.author | Shiroko, Priscillah A | |
dc.date.accessioned | 2023-02-15T07:00:49Z | |
dc.date.available | 2023-02-15T07:00:49Z | |
dc.date.issued | 2022 | |
dc.identifier.uri | http://erepository.uonbi.ac.ke/handle/11295/162518 | |
dc.description.abstract | Gaussian mixture models are applied in unsupervised machine learning, for example in image segmentation and music classification. In this project, it is shown how the EM algorithm is derived and how it is used to soft-cluster data sets into distributions. The EM algorithm estimates the parameters of a model in a fast and stable way, fills in missing data in a sample, and finds the values of latent variables. A Gaussian mixture model works at the level of distributions: it groups together only those data points that belong to a similar distribution. This is achieved through soft clustering, whereby each point is assigned a probability of belonging to each distribution; points lying between distributions are therefore clustered accurately, since the model shows the extent to which a data point falls within a particular distribution. The Expectation Maximization algorithm uses the observed data to obtain optimal values from which the model parameters are generated. Limitations anticipated within this study include: (i) the EM algorithm converges slowly, and only to a local optimum; (ii) it requires both forward and backward probabilities, whereas numerical optimization requires only the forward probability. | en_US |
dc.language.iso | en | en_US |
dc.publisher | University of Nairobi | en_US |
dc.rights | Attribution-NonCommercial-NoDerivs 3.0 United States | * |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/3.0/us/ | * |
dc.title | Expectation Maximization Algorithm Application in Gaussian Mixture Models | en_US |
dc.type | Thesis | en_US |
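The soft-clustering procedure described in the abstract — assign each point a probability of belonging to each Gaussian component (E-step), then re-estimate the component parameters from those probabilities (M-step) — can be sketched as follows. This is an illustrative one-dimensional example, not code from the thesis itself; the function name `em_gmm_1d`, the quantile-based initialization, and the sample data are all assumptions made for the sketch.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=50):
    """Fit a 1-D Gaussian mixture with k components via EM (illustrative sketch)."""
    n = len(x)
    # Initialise means at spread-out quantiles of the data (an assumed heuristic),
    # with broad equal variances and uniform mixture weights.
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)
    var = np.full(k, np.var(x))
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # i.e. the soft-cluster assignment of each data point.
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from the responsibilities.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var, r

# Synthetic data: two well-separated clusters around 0 and 5.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(5.0, 1.0, 200)])
pi, mu, var, r = em_gmm_1d(x, k=2)
```

The returned `r` matrix makes the "in between" behaviour from the abstract concrete: a point midway between the two means receives a responsibility near 0.5 for each component rather than a hard label. The sketch also illustrates the stated limitation that EM only reaches a local optimum — a poor initialization can leave both means in the same cluster.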