Expressing reactive emotion based on multimodal emotion recognition for natural conversation in human-robot interaction.
Yuanchao Li, Carlos Toshinori Ishi, Koji Inoue, Shizuka Nakamura, Tatsuya Kawahara. Published in: Adv. Robotics (2019)
Keyphrases
- emotion recognition
- human-robot interaction
- audio-visual
- pointing gestures
- natural interaction
- emotional speech
- multimodal
- gesture recognition
- facial expressions
- human-robot
- human-computer interaction
- humanoid robot
- emotion classification
- affective computing
- sentiment analysis
- robot programming
- multimodal interfaces
- emotional state
- information fusion
- visual information
- physiological signals
- service robots
- artificial intelligence
- facial images
- neural network
- face recognition
- multimodal interaction
- low-level
- visual data