This document provides an overview of probabilistic topic models. It discusses latent Dirichlet allocation (LDA), a commonly used topic model, and how it represents each document as a mixture over latent topics, with each word generated by one of those topics. Inference algorithms and parameter selection for LDA are also summarized. Evaluation methods for topic models, including held-out likelihood, topic coherence, and topic intrusion, are outlined; these assess how well a model fits the data and how interpretable its topics are to humans.
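As a concrete illustration of the workflow summarized above (a minimal sketch, not code from the document itself), the snippet below fits a two-topic LDA model on a tiny toy corpus with scikit-learn, inspects the top words per topic, and reports held-out perplexity, a common proxy for held-out likelihood. The corpus, the number of topics, and the choice of scikit-learn are assumptions made for illustration.

```python
# Sketch: fit LDA on a toy corpus, inspect topics, and score held-out data.
# Corpus and settings are illustrative assumptions, not from the document.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

train_docs = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "the stock market fell today",
    "investors traded shares on the market",
]
held_out_docs = ["the dog chased the cat", "the market rallied for investors"]

# Bag-of-words counts; LDA models each document as a mixture of latent topics.
vectorizer = CountVectorizer()
X_train = vectorizer.fit_transform(train_docs)
X_test = vectorizer.transform(held_out_docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(X_train)  # per-document topic proportions

# Top words per topic: components_[k, v] is the (unnormalized) topic-word weight.
vocab = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[-4:][::-1]
    print(f"topic {k}:", [vocab[v] for v in top])

# Held-out likelihood is commonly reported as perplexity (lower is better).
print("held-out perplexity:", lda.perplexity(X_test))
```

Human-facing checks such as topic coherence and topic intrusion complement this kind of likelihood-based score, since a model can fit held-out data well while still producing topics that people find hard to interpret.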