Evaluating data-driven co-speech gestures of embodied conversational agents through real-time interaction.
Yuan He, André Pereira, Taras Kucherenko. Published in: IVA (2022)
Keyphrases
- data driven
- embodied conversational agents
- real time
- virtual characters
- human communication
- human robot interaction
- hidden Markov models
- speech recognition
- hand movements
- human computer interaction
- hand postures
- vision system
- gesture recognition
- image sequences
- gaze control
- emotional state
- user interface
- automatic speech recognition
- multimodal interfaces
- sign language
- hand gestures
- social interaction
- computational model
- face to face interactions