Gaussian Mixture Models
Advanced Techniques for Mobile Robotics
Wolfram Burgard, Cyrill Stachniss, Kai Arras, Maren Bennewitz
Recap K-Means
§ Can be applied for clustering data
§ Computes new centroids for the clusters in an iterative manner
§ Converges to a local optimum
§ Uses a fixed variance
§ But the shapes of the clusters can be different in reality!
Motivation
[Figure: k-means clustering vs. clustering based on a mixture of Gaussians; image source: Wikipedia]
Mixtures of Gaussians
§ Assume that the data points are generated by sampling from a continuous distribution
§ A mixture of Gaussians is such a generative model
§ K Gaussians with means μ_k and covariance matrices Σ_k
§ Each point is generated from one mixture component (but we don’t know from which one)
§ Use mixing coefficients π_k (the probability that a data point is generated from component k)
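The generative view above can be sketched in code: to draw a sample, first pick a component k with probability π_k, then sample from that component's Gaussian (an illustrative NumPy sketch; the function name and arguments are hypothetical, not from the slides):

```python
import numpy as np

def sample_gmm(n, pis, mus, sigmas, seed=0):
    """Generate n points from a mixture of Gaussians:
    pick a latent component k with probability pi_k,
    then sample from N(mu_k, Sigma_k)."""
    rng = np.random.default_rng(seed)
    # Latent component index for each point (this is what we don't observe)
    comps = rng.choice(len(pis), size=n, p=pis)
    X = np.array([rng.multivariate_normal(mus[k], sigmas[k]) for k in comps])
    return X, comps
```

Clustering with a GMM amounts to inverting this process: given only X, recover the π_k, μ_k, Σ_k and (softly) the latent components.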
EM for Gaussian Mixture Models (GMMs)
§ E-step: Softly assign data points to mixture components:

    γ(z_nk) = π_k N(x_n | μ_k, Σ_k) / Σ_j π_j N(x_n | μ_j, Σ_j)

  where x_n is the data point and k indexes the mixture component
§ Similar to k-means, but takes the mixing coefficients into account
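The E-step above can be sketched as follows, computing the responsibility γ(z_nk) of each component for each point (a minimal NumPy sketch of the standard responsibility formula; all names are illustrative):

```python
import numpy as np

def gaussian_pdf(X, mu, sigma):
    """Density N(x | mu, sigma) evaluated at each row of X."""
    d = X.shape[1]
    diff = X - mu
    inv = np.linalg.inv(sigma)
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(sigma))
    expo = -0.5 * np.einsum('ni,ij,nj->n', diff, inv, diff)
    return np.exp(expo) / norm

def e_step(X, pis, mus, sigmas):
    """Soft assignment: gamma[n, k] = responsibility of component k for x_n."""
    K = len(pis)
    # Numerator: pi_k * N(x_n | mu_k, sigma_k) for each component
    gamma = np.column_stack(
        [pis[k] * gaussian_pdf(X, mus[k], sigmas[k]) for k in range(K)])
    gamma /= gamma.sum(axis=1, keepdims=True)  # normalize over components
    return gamma
```

Each row of `gamma` sums to 1, so a point is distributed over all components instead of being assigned to exactly one, as it would be in k-means.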
EM for Gaussian Mixture Models (GMMs)
§ E-step: Softly assign data points to mixture components
§ M-step: Re-estimate the parameters of each mixture component based on the soft assignments:

    μ_k = (1/N_k) Σ_n γ(z_nk) x_n
    Σ_k = (1/N_k) Σ_n γ(z_nk) (x_n − μ_k)(x_n − μ_k)^T
    π_k = N_k / N

  where N_k = Σ_n γ(z_nk) is the effective number of points with “soft” assignments to k (as in k-means)
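The M-step re-estimation can be sketched as follows (a minimal NumPy illustration of the weighted-average updates; names are hypothetical):

```python
import numpy as np

def m_step(X, gamma):
    """Re-estimate (pi_k, mu_k, sigma_k) from soft assignments gamma[n, k]."""
    N, K = gamma.shape
    Nk = gamma.sum(axis=0)            # effective number of points per component
    pis = Nk / N                      # mixing coefficients
    mus = (gamma.T @ X) / Nk[:, None] # responsibility-weighted means
    sigmas = []
    for k in range(K):
        diff = X - mus[k]
        # Responsibility-weighted covariance of component k
        sigmas.append((gamma[:, k, None] * diff).T @ diff / Nk[k])
    return pis, mus, np.array(sigmas)
```

With hard (0/1) assignments these updates reduce to the k-means centroid update plus per-cluster covariances, which is why the slide calls N_k the effective number of points, "as in k-means". Iterating `e_step` and `m_step` until the log-likelihood stops improving gives the full EM loop.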
EM with GMMs
[Figure: EM iterations on a Gaussian mixture; image source: C. M. Bishop]
Properties of Gaussian Mixture Models
§ Can approximate any continuous distribution arbitrarily well (given enough components)
§ The number of mixture components must be chosen separately (as with k-means)
§ EM for GMMs is computationally more expensive than k-means
§ EM converges more slowly than k-means
§ Results depend on the initialization
§ K-means can be used for initialization (to speed up convergence and to find a “better” local optimum)
Initialization with K-Means
§ Run k-means N times
§ Take the best result (highest likelihood)
§ Use this result to initialize EM for the GMM:
§ Set μ_j to the mean of cluster j from k-means
§ Set Σ_j to the covariance of the data points associated with cluster j
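The initialization above can be sketched as follows. The slides set only the means and covariances; setting the mixing coefficient π_j to the fraction of points in cluster j is a common additional choice, added here as an assumption (all names are illustrative):

```python
import numpy as np

def init_gmm_from_kmeans(X, centroids, labels):
    """Initialize GMM parameters from a k-means result:
    mu_j = centroid of cluster j,
    sigma_j = covariance of the points in cluster j,
    pi_j = fraction of points in cluster j (an added convention,
           not stated on the slide)."""
    K = len(centroids)
    mus = np.asarray(centroids, dtype=float)
    pis = np.array([np.mean(labels == j) for j in range(K)])
    sigmas = np.array([np.cov(X[labels == j].T, bias=True) for j in range(K)])
    return pis, mus, sigmas
```

Starting EM from these parameters typically needs far fewer iterations than starting from random means with identical covariances.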
Further Reading
E. Alpaydin: Introduction to Machine Learning
C. M. Bishop: Pattern Recognition and Machine Learning
J. A. Bilmes: A Gentle Tutorial of the EM Algorithm and its Applications to Parameter Estimation (technical report)
