Contents-aware gesture interaction using wearable motion sensor.
Tsukasa Ike, Toshiaki Nakasu, Yasunobu Yamauchi
Published in: ISWC Adjunct (2014)
Keyphrases
- gesture recognition
- human computer interaction
- hand gestures
- hand motion
- sensor networks
- human robot interaction
- inertial sensors
- accelerometer data
- motion capture
- motion analysis
- motion estimation
- motion patterns
- image sequences
- sensor data
- multimodal interfaces
- motion tracking
- optical flow
- input device
- activity recognition
- wearable sensors
- wearable devices
- augmented reality
- motion model
- motion sequences
- multimedia
- body movements
- space time
- hidden Markov models
- visual feedback
- human hand
- humanoid robot
- moving objects
- real time
- human motion
- sign language
- human movement
- metadata
- gaze control
- spatio-temporal
- wireless sensor networks
- user interaction
- robot motion
- camera motion
- eye tracking
- multi sensor
- visual input
- joint angles
- motion planning
- sensor fusion
- position and orientation
- eye tracker