Intentional microgesture recognition for extended human-computer interaction

Abstract

As extended reality becomes more ubiquitous, people will more frequently interact with computer systems using gestures instead of peripheral devices. However, prior work has shown that performing traditional gestures (pointing, swiping, etc.) in mid-air causes fatigue, rendering them largely unsuitable for long-term use. Some of the same researchers have promoted "microgestures" (smaller gestures requiring less gross motion) as a solution, but to date no dataset of intentional microgestures exists for training computer vision algorithms to support downstream interactions with computer systems, such as agents deployed on XR headsets. As a step toward addressing this challenge, I present a novel video dataset of microgestures, report classification results from a variety of ML models that demonstrate both the feasibility and the difficulty of detecting these fine-grained movements, and discuss the challenges of developing robust microgesture recognition for human-computer interaction.
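
The abstract refers to classification results from a variety of ML models, but the record itself includes no code. Purely as an illustrative sketch (not the author's method), the snippet below shows how one might fine-tune a generic pretrained 3D-CNN video classifier (torchvision's r3d_18) on microgesture clips; the class count, clip shape, and training settings are placeholder assumptions rather than details from this dataset.

    # Illustrative sketch only: fine-tune a Kinetics-pretrained 3D CNN
    # as a microgesture clip classifier. NUM_CLASSES and CLIP_SHAPE are
    # placeholder assumptions, not specifications from this dataset.
    import torch
    import torch.nn as nn
    from torchvision.models.video import r3d_18, R3D_18_Weights

    NUM_CLASSES = 10                 # assumed number of microgesture classes
    CLIP_SHAPE = (3, 16, 112, 112)   # (channels, frames, height, width)

    # Load the pretrained backbone and replace its classification head.
    model = r3d_18(weights=R3D_18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    # Random tensors standing in for real (clip, label) pairs.
    clips = torch.randn(4, *CLIP_SHAPE)
    labels = torch.randint(0, NUM_CLASSES, (4,))

    model.train()
    for step in range(3):            # a few dummy optimization steps
        optimizer.zero_grad()
        logits = model(clips)        # shape: (batch, NUM_CLASSES)
        loss = criterion(logits, labels)
        loss.backward()
        optimizer.step()
        print(f"step {step}: loss={loss.item():.4f}")

In practice the placeholder tensors would be replaced by a dataloader over the dataset's labeled video clips, and the feasibility/difficulty trade-off noted in the abstract would show up in how well such fine-grained classes separate under this kind of baseline.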

Description

Rights Access

Subject

computer vision
machine learning
artificial intelligence
microgesture
human computer interaction

Citation

Associated Publications