The 11th ACM SIGCHI Symposium on Engineering
Interactive Computing Systems
18–21 June 2019, Valencia, Spain
An Ontology for Reasoning on Body-based Gestures
Mehdi Ousmer, Jean Vanderdonckt,
Université catholique de Louvain,
Belgium
Sabin Buraga,
Alexandru Ioan Cuza University of Iasi,
Romania
Results
The Kinect sensor is widely used across different users, environments, and applications. The purpose of this work is to design tools that structure gestures for automated reasoning on body-based gestures acquired by a Kinect sensor. We therefore propose a sensor-independent ontology, organized into parts with intrinsic and extrinsic properties. To validate this proposal, a gesture elicitation study was conducted, which fed the ontology with 456 elicited gestures.
The ontology
[Figure: Device frequency of usage — participants' self-reported usage frequency (scale 1–7) of Computer, Smartphone, Tablet, Game, and Kinect]
Feeding the ontology
Participants:
•Twenty-four participants (11 female, 13 male)
•Average age of 34.5 years (range: 23 to 53)
•Volunteers with different occupations
•Most were not familiar with the Kinect sensor
Apparatus:
•The experiment took place in a controlled environment (usability laboratory)
•Each referent was shown to the participant
•The time the participant took to think of a gesture was measured
•Gestures were recorded by a camera and a Kinect sensor
•The participant was asked to rate the goodness-of-fit of the proposed gesture
References:
[1] Radu-Daniel Vatavu and Jacob O. Wobbrock. 2015. Formalizing Agreement Analysis for Elicitation Studies: New Measures, Significance Test, and Toolkit. In Proc. of CHI ’15. 1325–1334.
[2] Jacob O. Wobbrock, Meredith Ringel Morris, and Andrew D. Wilson. 2009. User-defined Gestures for Surface Computing. In Proc. of CHI ’09. 1083–1092.
Experiment:
•Gesture elicitation study based on the original methodology [1][2]
•Nineteen referents drawn from IoT actions
•Referents divided into three categories (Unary/Binary/Linear)
[Figure data: thinking times and goodness-of-fit ratings per referent]

Referent                        Thinking time (s)   Goodness of fit (1–10)
Turn TV On                      11.54               7.54
Turn TV Off                     11.17               7.21
Start Player                    17.14               7.25
Increase Volume                  7.90               8.04
Decrease Volume                  8.13               7.50
Go to Next Item in a List        8.31               7.71
Go to Previous Item in a List    8.36               7.46
Turn Air Conditioning On         9.27               6.92
Turn Air Conditioning Off       10.34               6.79
Turn Light On                    8.40               7.58
Turn Light Off                   6.74               7.75
Brighten Lights                 10.63               6.83
Dim Lights                      10.83               7.04
Turn Heating System On           7.63               7.75
Turn Heating System Off          9.99               7.25
Turn Alarm On                   17.31               6.63
Turn Alarm Off                  11.96               6.88
Answer Phone Call                5.85               7.75
End Phone Call                   7.45               8.08
Agreement rate:
•Agreement rates computed with AGATe [1]
•On average, agreement rates are of medium magnitude
•Most referents fall in the medium range
•Rates are similar to those reported in the literature
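The agreement rate AR(r) for a referent, as defined in [1], can be sketched in a few lines of Python (a minimal illustration of the formula, not the AGATe toolkit itself; the gesture labels below are hypothetical):

```python
from collections import Counter

def agreement_rate(proposals):
    """AR(r) for one referent, following Vatavu & Wobbrock [1]:
    AR(r) = sum over groups Pi of identical proposals of
            (|Pi| / |P|) * ((|Pi| - 1) / (|P| - 1)),
    where P is the full list of proposals elicited for the referent."""
    n = len(proposals)
    if n < 2:
        return 1.0 if n == 1 else 0.0
    groups = Counter(proposals)  # size of each group of identical proposals
    return sum((k / n) * ((k - 1) / (n - 1)) for k in groups.values())

# Example: 24 participants proposing three distinct gestures for one referent.
proposals = ["swipe-right"] * 12 + ["point"] * 8 + ["wave"] * 4
print(round(agreement_rate(proposals), 3))  # → 0.362, a medium-range rate
```

With 24 participants per referent, as in this study, a rate in this range falls in the "medium" band reported above.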
Classification:
•Half of the gestures were performed in a medium range
•The dominant hand was used most often
•The study resulted in 53 unique gestures grouped into 23 gesture categories
•No particular correlation between thinking times for pairs of complementary referents
•Goodness-of-fit values are above average, indicating that participants were satisfied with the gestures they chose
User satisfaction:
•Subjective user satisfaction measured with the IBM PSSUQ questionnaire
•Participants were subjectively satisfied with body-based gestures
•All measured values are above 5 on a scale of 1 to 7
Conclusion:
•A basis for describing body-based gestures for reasoning tasks (e.g., beyond querying)
•Potential benefits of the ontology: reusability, coherence, extensibility, …
•The ontology defines three main classes:
•User: information about the user and anatomical body parts and joints
•Sensor: raw data provided by a device
•Detected gestures and poses: includes Gesture, GestureSegment, HandState, and Pose
•Representation of the body-based gesture in its context of use
•The ontology is based on the user, the sensor, and the physical environment
•Expressed in OWL (Web Ontology Language), a W3C standard, as <subject, predicate, object> triples
•Joints: objects that can be compared with respect to their position in a Cartesian space
•Segment: merged joints that specify a body pose
•Pose: the basic building block of a gesture, which can be manipulated using logical constructs (And/Or/Not)
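The composition described above — joints compared in Cartesian space, segments merged from joints, and poses combined with And/Or/Not — can be illustrated with a small Python sketch (a hypothetical model of these concepts, not the OWL ontology itself; all class and method names here are invented for illustration):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Joint:
    """A tracked skeletal joint, comparable by its Cartesian position."""
    name: str
    x: float
    y: float
    z: float

    def is_above(self, other: "Joint") -> bool:
        return self.y > other.y

@dataclass(frozen=True)
class Segment:
    """Joints merged into a segment that specifies part of a body pose."""
    joints: tuple

class Pose:
    """Basic building block of a gesture; combinable with And/Or/Not."""
    def __init__(self, predicate):
        self.predicate = predicate  # body snapshot -> bool

    def holds(self, body) -> bool:
        return self.predicate(body)

    def __and__(self, other):   # Pose And Pose
        return Pose(lambda b: self.holds(b) and other.holds(b))

    def __or__(self, other):    # Pose Or Pose
        return Pose(lambda b: self.holds(b) or other.holds(b))

    def __invert__(self):       # Not Pose
        return Pose(lambda b: not self.holds(b))

# Example: "right hand above head" And Not "left hand above head".
body = {"head": Joint("head", 0.0, 1.7, 0.0),
        "hand_r": Joint("hand_r", 0.3, 1.9, 0.0),
        "hand_l": Joint("hand_l", -0.3, 1.1, 0.0)}
right_up = Pose(lambda b: b["hand_r"].is_above(b["head"]))
left_up = Pose(lambda b: b["hand_l"].is_above(b["head"]))
print((right_up & ~left_up).holds(body))  # True for this body snapshot
```

In the actual ontology such conditions would be expressed as OWL triples rather than Python predicates, but the layering is the same: joints feed segments, segments characterize poses, and poses compose into gestures.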