Entropy Converges Between Dialogue Participants: Explanations from an Information-Theoretic Perspective
Yang Xu, David Reitter. Published in: ACL (1) (2016)
Keyphrases
- information-theoretic
- information theory
- mutual information
- theoretic framework
- turn-taking
- information bottleneck
- entropy measure
- log-likelihood
- Jensen-Shannon divergence
- relative entropy
- multi-modality
- information-theoretic measures
- minimum description length
- Kullback-Leibler divergence
- KL divergence
- MDL principle
- computational learning theory
- medical images
- Bayesian networks
- similarity measure
- machine learning
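Several of the keyphrases above name standard information-theoretic quantities. A minimal sketch in plain Python (the function names are illustrative, not from the paper) of entropy, Kullback-Leibler divergence, and the Jensen-Shannon divergence built from it:

```python
import math

def entropy(p):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler (relative entropy) D(p || q), in bits."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetrized, bounded form of KL,
    measured against the mixture m = (p + q) / 2."""
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]
print(entropy(uniform))              # 2.0 bits (maximal for 4 outcomes)
print(js_divergence(uniform, skewed))
```

Unlike KL divergence, the Jensen-Shannon divergence is symmetric in its arguments and bounded between 0 and 1 bit, which is why it is often preferred as a similarity measure between distributions.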