How do speakers coordinate planning and articulation? Evidence from gaze-speech lags.
Chiara Gambi, Matthew W. Crocker
Published in: CogSci (2017)
Keyphrases
- eye tracking
- empirical evidence
- eye movements
- gaze control
- visual search
- visual attention
- spoken language
- hand movements
- human communication
- eye tracking data