This document discusses K-means clustering, Expectation-Maximization (EM), and Gaussian mixture models (GMMs). It begins with an overview of unsupervised learning and introduces K-means as a simple clustering algorithm. It then presents EM as a general algorithm for maximum likelihood estimation that can be applied to problems such as fitting a GMM. Finally, a GMM is described as a density estimation technique that models data as a weighted sum of Gaussian distributions, with EM used to estimate the mixture's parameters from data.
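To make the relationship between EM and GMMs concrete, the following is a minimal NumPy sketch of fitting a GMM with EM. It is not the document's own derivation; the function name `em_gmm` and parameters such as `n_iter` are hypothetical, and the initialization (random data points as means, a shared sample covariance) is one simple choice among many.

```python
import numpy as np

def em_gmm(X, k, n_iter=100, seed=0):
    """Fit a k-component Gaussian mixture to data X of shape (n, d) using EM.

    Hypothetical illustration: returns mixture weights, means, and covariances.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Initialization (an assumption, not the document's prescription):
    # uniform weights, k random data points as means, shared sample covariance.
    weights = np.full(k, 1.0 / k)
    means = X[rng.choice(n, size=k, replace=False)]
    covs = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(k)])

    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i) under the
        # current parameters, computed from weighted Gaussian densities.
        r = np.zeros((n, k))
        for j in range(k):
            diff = X - means[j]
            inv = np.linalg.inv(covs[j])
            norm = 1.0 / np.sqrt(((2 * np.pi) ** d) * np.linalg.det(covs[j]))
            r[:, j] = weights[j] * norm * np.exp(
                -0.5 * np.sum(diff @ inv * diff, axis=1)
            )
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate weights, means, and covariances as
        # responsibility-weighted maximum likelihood estimates.
        nk = r.sum(axis=0)
        weights = nk / n
        means = (r.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - means[j]
            covs[j] = (r[:, j, None] * diff).T @ diff / nk[j] + 1e-6 * np.eye(d)

    return weights, means, covs
```

As a usage sketch, calling `em_gmm(X, k=3)` on a two-dimensional array `X` of observations returns the estimated mixture weights, component means, and covariance matrices after the chosen number of EM iterations.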