This document provides an introduction to Hidden Markov Models (HMMs). It describes HMMs as statistical models in which a sequence of hidden states generates a sequence of observable outputs. Three fundamental HMM problems are discussed: evaluation, decoding, and training, together with the algorithms that solve them: the Forward-Backward algorithm, the Viterbi algorithm, and the Baum-Welch algorithm. The document also explains how HMM concepts relate to probability terms such as prior, posterior, and evidence.
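As a brief sketch of how the three problems can be stated, assuming the standard parameterization $\lambda = (A, B, \pi)$ for the transition, emission, and initial-state probabilities (a notational assumption, not necessarily the notation used later in the document):

Evaluation: compute $P(O \mid \lambda)$, the likelihood of an observation sequence $O$ under the model.
Decoding: find $Q^{*} = \arg\max_{Q} P(Q \mid O, \lambda)$, the most probable hidden state sequence given the observations.
Training: find $\lambda^{*} = \arg\max_{\lambda} P(O \mid \lambda)$, the model parameters that best explain the observed data.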