The study presented at the 11th ACM SIGCHI Symposium focuses on developing an ontology for reasoning about body-based gestures captured by a Kinect sensor, with the aim of improving automated gesture interpretation. A gesture elicitation study with 24 participants yielded 456 gestures, categorized into 23 types, and provided insights into user satisfaction and gesture agreement rates. The resulting ontology, expressed in OWL for broader applicability, includes key classes related to users, sensors, and detected gestures.
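To make the shape of such an ontology concrete, the sketch below uses Python's rdflib to declare a few OWL classes and linking properties. It is a minimal illustration only, not the paper's actual schema: the class names (User, Sensor, Gesture) and property names (performsGesture, detectedBy) are assumptions chosen to mirror the concepts mentioned above.

```python
from rdflib import Graph, Namespace, RDF, RDFS, OWL

# Illustrative namespace; the paper's real ontology IRI is not given here.
GEST = Namespace("http://example.org/gesture-ontology#")

g = Graph()
g.bind("gest", GEST)

# Declare the three core classes mentioned in the summary.
for cls in (GEST.User, GEST.Sensor, GEST.Gesture):
    g.add((cls, RDF.type, OWL.Class))

# A user performs a gesture (hypothetical property for illustration).
g.add((GEST.performsGesture, RDF.type, OWL.ObjectProperty))
g.add((GEST.performsGesture, RDFS.domain, GEST.User))
g.add((GEST.performsGesture, RDFS.range, GEST.Gesture))

# A gesture is detected by a sensor, e.g. a Kinect (also hypothetical).
g.add((GEST.detectedBy, RDF.type, OWL.ObjectProperty))
g.add((GEST.detectedBy, RDFS.domain, GEST.Gesture))
g.add((GEST.detectedBy, RDFS.range, GEST.Sensor))

# Serialize to Turtle; any OWL syntax (e.g. RDF/XML) would work equally well.
print(g.serialize(format="turtle"))
```

Because the vocabulary is plain OWL, a sketch like this could be loaded into any standard reasoner or triple store, which is the portability benefit the summary attributes to expressing the ontology in OWL.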