Understanding Gesture and Speech Multimodal Interactions for Manipulation Tasks in Augmented Reality Using Unconstrained Elicitation
Adam S. Williams, Francisco Raul Ortega. Published in: CoRR (2020)
Keyphrases
- augmented reality
- multimodal interfaces
- human computer interaction
- manipulation tasks
- user interface
- human activities
- virtual objects
- virtual reality
- gesture recognition
- robot navigation
- real environment
- hidden markov models
- input device
- human robot interaction
- hand gestures
- motion planning
- eye tracking
- robotic systems
- service robots
- tangible interaction
- humanoid robot
- machine learning
- field of view
- robotic arm
- software engineering
- 3D objects
- camera tracking