A Speech-Driven Pupil Response System with Affective Expression Using Hemispherical Displays.
Yoshihiro Sejima, Shoichi Egawa, Ryosuke Maeda, Yoichiro Sato, Tomio Watanabe
Published in: RO-MAN (2018)
Keyphrases
- emotion recognition
- pupil size
- speech signal
- facial animation
- high resolution
- speech recognition
- infrared
- data driven
- audio visual
- emotional state
- visual stimuli
- affect sensing
- eye tracking
- cognitive model
- text to speech
- real time
- affect recognition
- affective computing
- hearing impaired
- speech synthesis
- broadcast news
- dialogue system
- automatic speech recognition
- pedagogical agents
- field of view
- human computer interaction