This document summarizes a project to control a virtual human using gestures recognized by the Microsoft Kinect sensor. The Kinect tracks the positions of a user's joints, and gestures are recognized by comparing the relative locations of those joints. Gestures such as moving the hands up or down raise or lower the virtual human's heart rate and blood pressure, and performing CPR is recognized as the hands moving in and out toward the torso. The virtual human then displays animations and reactions appropriate to its physiological state.
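As a rough sketch of this relative-position approach, the Python snippet below classifies a single skeleton frame by comparing joint coordinates. The joint names (`head`, `spine`, `hand_left`, `hand_right`), the depth threshold, and the `recognize_gesture` helper are illustrative assumptions rather than the Kinect SDK's actual API; a full implementation would also track the in-and-out motion of a CPR compression across successive frames rather than a single pose.

```python
# Minimal sketch of rule-based gesture recognition over tracked joints.
# Joint names and thresholds are illustrative assumptions, not the real
# Kinect SDK; a real system would read skeleton frames from the sensor.

from dataclasses import dataclass

@dataclass
class Joint:
    x: float  # lateral position (meters)
    y: float  # vertical position (meters)
    z: float  # depth from the sensor (meters)

def recognize_gesture(joints: dict[str, Joint]) -> str | None:
    """Classify one pose by comparing relative joint positions."""
    head = joints["head"]
    spine = joints["spine"]
    left, right = joints["hand_left"], joints["hand_right"]

    # Both hands above the head -> raise heart rate / blood pressure.
    if left.y > head.y and right.y > head.y:
        return "hands_up"
    # Both hands below the spine -> lower heart rate / blood pressure.
    if left.y < spine.y and right.y < spine.y:
        return "hands_down"
    # Both hands close to the torso in depth marks one phase of a CPR
    # compression; alternating in/out along z over successive frames
    # would be detected by comparing this result frame to frame.
    if abs(left.z - spine.z) < 0.15 and abs(right.z - spine.z) < 0.15:
        return "cpr_compression"
    return None

# Example frame: both hands held above the head.
frame = {
    "head": Joint(0.0, 1.7, 2.0),
    "spine": Joint(0.0, 1.1, 2.0),
    "hand_left": Joint(-0.2, 1.9, 2.0),
    "hand_right": Joint(0.2, 1.9, 2.0),
}
print(recognize_gesture(frame))  # -> "hands_up"
```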