This document discusses collective communications in MPI (Message Passing Interface). It provides examples of collective communication routines such as broadcast, scatter, gather, reduce, and scan. It also presents three programs that use collective communication to calculate pi: one each in C, Fortran, and C++. The programs decompose the problem across processes, calculate a local sum, and use a reduce operation to combine the results. The document also briefly discusses the master-slave paradigm and matrix multiplication using collective operations.
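The full pi programs appear later in the document. As a minimal C sketch of the structure described above, the following illustrates the pattern: broadcast the problem size, compute a local partial sum, then reduce the partial sums onto one process. The interval count, midpoint-rule integrand, and strided loop distribution here are assumptions for illustration, not copied from the document's listings.

```c
#include <mpi.h>
#include <stdio.h>
#include <math.h>

int main(int argc, char *argv[]) {
    int rank, size;
    int n = 100000;                 /* number of intervals (assumed default) */
    double h, x, local_sum = 0.0, pi;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Collective: broadcast the interval count from rank 0 to all ranks. */
    MPI_Bcast(&n, 1, MPI_INT, 0, MPI_COMM_WORLD);

    /* Each rank sums the midpoint-rule contributions of its own strided
       subset of intervals: i = rank, rank+size, rank+2*size, ... */
    h = 1.0 / (double)n;
    for (int i = rank; i < n; i += size) {
        x = h * ((double)i + 0.5);
        local_sum += 4.0 / (1.0 + x * x);
    }
    local_sum *= h;

    /* Collective: combine the partial sums on rank 0 with a sum reduction. */
    MPI_Reduce(&local_sum, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("pi is approximately %.16f, error %.2e\n",
               pi, fabs(pi - 3.141592653589793));

    MPI_Finalize();
    return 0;
}
```

A typical build-and-run sequence for a sketch like this would be `mpicc pi.c -o pi` followed by `mpirun -np 4 ./pi`; only rank 0 prints the combined result, since it is the root of the reduce.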