An Evaluation of an Augmented Reality Multimodal Interface Using Speech and Paddle Gestures.
Sylvia Irawati, Scott A. Green, Mark Billinghurst, Andreas Dünser, Heedong Ko. Published in: ICAT (2006)
Keyphrases
- augmented reality
- multimodal interfaces
- human computer interaction
- gesture recognition
- hand gestures
- user interface
- virtual objects
- multimodal interaction
- virtual reality
- human interaction
- head mounted display
- eye tracking
- learning mechanism
- hidden Markov models
- human centered
- daily life
- user centered
- expert systems
- learning environment
- three dimensional
- live video
- camera tracking
- machine learning