This document provides an overview and schedule for a course on developing gesture-based natural user interfaces with the Kinect sensor. The course aims to teach students how to use programming environments such as Processing, Pure Data, and Unity to build interactive applications based on full-body interaction. Students first work on small individual projects, then collaborate in groups on a larger final project to be exhibited at the end of the semester. The document also covers topics such as the components of a user interface, the evolution of interface styles, and how the Kinect sensor works, combining infrared depth sensing with skeletal tracking.
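
To give a concrete sense of the kind of skeletal-tracking code the course works toward, below is a minimal Processing sketch. It assumes the SimpleOpenNI library (a Kinect wrapper commonly used with Processing; the document itself does not name a specific library, and method names vary slightly between SimpleOpenNI versions). The sketch displays the Kinect depth image and draws a circle over the head joint of any tracked user.

    // Minimal sketch, assuming the SimpleOpenNI wrapper for the Kinect (v1.96-style API).
    import SimpleOpenNI.*;

    SimpleOpenNI kinect;

    void setup() {
      size(640, 480);
      kinect = new SimpleOpenNI(this);
      kinect.enableDepth();   // infrared-based depth camera
      kinect.enableUser();    // user detection and skeleton tracking
    }

    void draw() {
      kinect.update();
      image(kinect.depthImage(), 0, 0);   // show the depth map

      int[] users = kinect.getUsers();
      for (int userId : users) {
        if (kinect.isTrackingSkeleton(userId)) {
          // Get the head joint in real-world coordinates, then project to screen space.
          PVector head = new PVector();
          kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_HEAD, head);
          PVector headOnScreen = new PVector();
          kinect.convertRealWorldToProjective(head, headOnScreen);
          fill(255, 0, 0);
          ellipse(headOnScreen.x, headOnScreen.y, 20, 20);
        }
      }
    }

    // Called by SimpleOpenNI when a new user enters the scene.
    void onNewUser(SimpleOpenNI context, int userId) {
      context.startTrackingSkeleton(userId);
    }

The same joint positions (head, hands, torso, and so on) can then drive interaction logic, for example mapping a hand position to a cursor or triggering events when a limb crosses a threshold, which is the basic pattern behind the full-body interfaces the course projects explore.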