A multimodal gesture recognition dataset for desktop human-computer interaction.
Qi Wang, Fengchao Zhu, Guangming Zhu, Liang Zhang, Ning Li, Eryang Gao
Published in: CoRR (2024)
Keyphrases
- gesture recognition
- human computer interaction
- user interface
- multimodal interfaces
- hand gestures
- pointing gestures
- hand gesture recognition
- sign language
- human computer interface
- hand tracking
- human computer
- sign language recognition
- eye tracking
- multimodal
- human machine interface
- human robot interaction
- interface design
- augmented reality
- motion capture
- stereo camera
- human factors
- facial expression recognition
- input device
- software engineering
- hand motion
- ubiquitous computing and ambient intelligence
- multimodal interaction
- interaction design
- hidden Markov models
- data model