FaceSight: Enabling Hand-to-Face Gesture Interaction on AR Glasses with a Downward-Facing Camera Vision-Based Solution
Yueting Weng, Chun Yu, Yingtian Shi, Yuhang Zhao, Yukang Yan, Yuanchun Shi
Published in: CHI (2021)
Keyphrases
- input device
- augmented reality
- human computer interaction
- hand gestures
- vision system
- multimodal interfaces
- stereo camera
- eye contact
- gesture recognition
- real time
- gaze tracking
- hand motion
- hand gesture recognition
- computer vision
- head movements
- virtual objects
- facial expressions
- field of view
- hand held
- eye gaze
- human hand
- hidden Markov models
- hand movements
- pose variations
- visual input
- human faces
- real scenes
- user interface
- face images
- hand tracking
- camera calibration
- video camera
- face tracking
- sign language
- real environment
- visual feedback
- multimodal interaction
- structure from motion
- position and orientation
- multiple cameras
- facial features
- pointing gestures