This document provides an overview of information theory and source coding. It defines key information-theoretic concepts such as entropy, a measure of uncertainty or average information content. Entropy is computed from the probabilities of the messages an information source emits: H(X) = -Σᵢ pᵢ log₂ pᵢ bits per symbol. The document also discusses discrete memoryless sources, which emit discrete symbols independently, each drawn from the same fixed probability distribution. The entropy of a discrete memoryless source represents the average information per message. Extended entropy is defined as the entropy of blocks of n symbols from the source; for a memoryless source it equals n times the per-symbol entropy.
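The quantities described above can be sketched in a few lines of Python. This is a minimal illustration, not a definitive implementation; the symbol probabilities used below are hypothetical. It computes the Shannon entropy of a source and the extended (block) entropy, which for a memoryless source works out to n times the per-symbol entropy.

```python
import math
from itertools import product

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def block_entropy(probs, n):
    """Entropy of length-n blocks from a discrete memoryless source.
    Because symbols are independent, each block's probability is the
    product of its symbols' probabilities."""
    block_probs = [math.prod(combo) for combo in product(probs, repeat=n)]
    return entropy(block_probs)

# Hypothetical three-symbol source
source = [0.5, 0.25, 0.25]
h = entropy(source)              # per-symbol entropy: 1.5 bits
h2 = block_entropy(source, 2)    # 2-symbol block entropy: 3.0 bits = 2 * h
```

Running this confirms the memoryless property numerically: the entropy of 2-symbol blocks is exactly twice the per-symbol entropy.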