In this work we propose a clustering algorithm that learns a finite Gaussian mixture model on-line from multivariate data, based on the expectation-maximization approach. Convergence to the right number of components, as well as to their means and covariances, is achieved without requiring any careful initialization. Our methodology starts from a single mixture component covering the whole data set and splits it incrementally during the expectation-maximization steps. Once the stopping criterion has been met, the classical EM algorithm is run with the best selected mixture in order to refine the solution. We show the effectiveness of the method in a series of simulated experiments and compare it with a state-of-the-art alternative technique, using both synthetic data and real images, including experiments with the iCub humanoid robot.
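A minimal sketch of the greedy idea described in the abstract, not the authors' exact algorithm: it starts from a single Gaussian component and adds components one at a time (here by simply refitting EM with k+1 components via scikit-learn's GaussianMixture, rather than splitting a component in place during the EM steps as the paper does), stopping when the BIC stops improving; BIC is an assumed stand-in for the paper's unspecified stopping criterion. The selected mixture is then refined with a longer classical EM run.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def greedy_gmm(X, max_components=10):
    """Greedy model selection sketch: grow k until BIC no longer improves."""
    best_model, best_bic = None, np.inf
    for k in range(1, max_components + 1):
        gmm = GaussianMixture(n_components=k, covariance_type="full",
                              random_state=0).fit(X)
        bic = gmm.bic(X)
        if bic < best_bic:
            best_model, best_bic = gmm, bic
        else:
            break  # stopping criterion reached (assumed: no further BIC improvement)
    # refine the selected mixture with a classical EM run from its current means
    refined = GaussianMixture(n_components=best_model.n_components,
                              covariance_type="full", max_iter=500,
                              means_init=best_model.means_).fit(X)
    return refined

# usage: three well-separated synthetic 2-D clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.5, size=(200, 2)) for m in (-4.0, 0.0, 4.0)])
model = greedy_gmm(X)
print(model.n_components, model.means_.round(2))
```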
Unsupervised Learning of Finite Gaussian Mixture Models (GMMs): A Greedy Approach
01.01.2011
16 pages
Article/Chapter (Book)
Electronic Resource
English
British Library Conference Proceedings | 2011
Unsupervised Learning for Finite Mixture Models Based on a Modified Gibbs Sampling
British Library Online Contents | 2009
Finite asymmetric generalized Gaussian mixture models learning for infrared object detection
British Library Online Contents | 2013