Using speech to identify gesture pen strokes in collaborative, multimodal device descriptions.
James Herold, Thomas F. Stahovich
Published in: Artif. Intell. Eng. Des. Anal. Manuf. (2011)
Keyphrases
- multimodal interfaces
- sketch recognition
- gesture recognition
- audio-visual
- multi-stream
- human computer interaction
- input device
- multimodal interaction
- hidden Markov models
- multi-modal
- hand movements
- automatic speech recognition
- endpoint detection
- gaze tracking
- future trends
- collaborative environment
- hand gestures
- eye tracking
- speech recognition