Understanding Gesture and Speech Multimodal Interactions for Manipulation Tasks in Augmented Reality Using Unconstrained Elicitation.
Adam S. Williams, Francisco Raul Ortega. Published in: Proc. ACM Hum. Comput. Interact. (2020)
Keyphrases
- augmented reality
- multimodal interfaces
- human-computer interaction
- manipulation tasks
- user interface
- human activities
- gesture recognition
- virtual objects
- virtual reality
- human-robot interaction
- robot navigation
- motion planning
- hand gestures
- real environment
- expert systems
- input device
- hidden Markov models
- camera tracking
- head-mounted display
- visual information
- virtual environment
- multimodal
- software engineering
- live video
- tangible interaction