Affective handshake with a humanoid robot: How do participants perceive and combine its facial and haptic expressions?
Mohamed Yacine Tsalamlal, Jean-Claude Martin, Mehdi Ammi, Adriana Tapus, Michel-Ange Amorim
Published in: ACII (2015)
Keyphrases
- humanoid robot
- facial expressions
- motion planning
- biologically inspired
- emotion recognition
- multimodal
- face recognition
- emotional state
- human robot interaction
- face images
- human computer interaction
- facial images
- motion capture
- human robot
- facial features
- human motion
- virtual environment
- joint space
- imitation learning
- motor skills
- video sequences
- body movements
- manipulation tasks
- fully autonomous
- computer vision
- robot arm
- pedagogical agents
- facial actions
- reinforcement learning
- natural language