This document summarizes a lecture on approximate inference methods in machine learning. It introduces core inference problems in graphical models, such as computing likelihoods (partition functions) and marginals. Exact inference via belief propagation is tractable only on tree-structured graphs; the junction tree algorithm extends exact inference to general graphs, but its cost grows exponentially with the graph's treewidth. The approximate methods discussed include loopy belief propagation, the mean field approximation, and Gibbs sampling. The lecture then covers exponential family graphical models, mean parameterization, and the marginal polytope, introduces variational inference as a unifying framework, and discusses the Bethe variational problem as a tractable approximation.
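To make one of the listed methods concrete, here is a minimal sketch of Gibbs sampling on a small Ising chain (a toy model chosen for illustration; the specific model, parameter values, and function name are assumptions, not taken from the lecture). Each node's spin is resampled from its conditional distribution given its neighbors, and post-burn-in samples estimate the node marginals.

```python
import math
import random

def gibbs_ising_chain(n_nodes=3, coupling=0.5, n_samples=5000, burn_in=500, seed=0):
    """Gibbs sampling for a toy Ising chain with spins in {-1, +1}.

    Target: p(x) proportional to exp(coupling * sum_i x_i * x_{i+1}).
    Returns Monte Carlo estimates of the marginals P(x_i = +1).
    """
    rng = random.Random(seed)
    x = [rng.choice([-1, 1]) for _ in range(n_nodes)]
    counts = [0] * n_nodes
    for t in range(n_samples + burn_in):
        for i in range(n_nodes):
            # Local field contributed by the chain neighbors of node i.
            field = 0.0
            if i > 0:
                field += coupling * x[i - 1]
            if i < n_nodes - 1:
                field += coupling * x[i + 1]
            # Conditional P(x_i = +1 | neighbors) = sigmoid(2 * field).
            p_plus = 1.0 / (1.0 + math.exp(-2.0 * field))
            x[i] = 1 if rng.random() < p_plus else -1
        if t >= burn_in:
            for i in range(n_nodes):
                if x[i] == 1:
                    counts[i] += 1
    return [c / n_samples for c in counts]

marginals = gibbs_ising_chain()
```

For this symmetric model (no external field), each true marginal P(x_i = +1) is exactly 0.5, so the estimates should hover near 0.5; the same loop structure carries over to general pairwise graphical models by summing the local field over all neighbors.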