This document provides an overview of possible 3D gestures for use in gestural user interfaces. It divides gestures into upper body gestures (movements above the waist), lower body gestures (movements below the waist), and full body gestures, and further splits each category into static poses and dynamic movements. Over 100 specific gestures are defined, each with a description of the functions it could trigger, such as playing or pausing media, scrolling, selecting options, and changing properties like volume. The document was created by the Embedded Interaction Lab team, with input from IBM Research, to explore socially acceptable gestures.