This document discusses conjugate gradient methods for minimizing quadratic functions. It begins by introducing quadratic functions and noting that conjugate gradient methods can minimize them without forming or inverting the full Hessian matrix, unlike Newton's method; only matrix-vector products with the Hessian are required. It then defines what it means for a set of vectors to be conjugate with respect to a positive definite matrix A: vectors d_1, ..., d_n are conjugate if their pairwise A-inner products vanish, i.e. d_i^T A d_j = 0 for all i ≠ j. The document proves that a set of n nonzero conjugate vectors is linearly independent and therefore forms a basis, and it describes a simple conjugate gradient algorithm that, in exact arithmetic, finds the minimum in at most n iterations using n conjugate search directions.
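
To make the summarized algorithm concrete, here is a minimal sketch of the standard conjugate gradient iteration for a quadratic of the assumed form f(x) = (1/2) x^T A x - b^T x with A symmetric positive definite (equivalently, solving A x = b). The function name `conjugate_gradient` and the tolerance parameter are illustrative choices, not taken from the source document.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A.

    In exact arithmetic this converges in at most n iterations, where n is
    the dimension of the problem, because the search directions are
    mutually A-conjugate. (Illustrative sketch, not the source's code.)
    """
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x            # residual = negative gradient of f at x
    d = r.copy()             # first search direction: steepest descent
    rs_old = r @ r
    for _ in range(n):
        Ad = A @ d
        alpha = rs_old / (d @ Ad)    # exact line search along d
        x = x + alpha * d
        r = r - alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        beta = rs_new / rs_old       # makes the next direction A-conjugate
        d = r + beta * d             # to all previous ones
        rs_old = rs_new
    return x

# Usage on a small symmetric positive definite system:
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)   # minimizer of f, i.e. the solution of A x = b
```

Note that the sketch never forms A's inverse and touches A only through the products `A @ x` and `A @ d`, which is the property the summary highlights in contrast to Newton's method.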