An information-theoretic account of availability effects in language production.
Richard Futrell
Published in: CogSci (2023)
Keyphrases
- information-theoretic
- mutual information
- information theory
- theoretical framework
- Jensen-Shannon divergence
- information bottleneck
- log likelihood
- information-theoretic measures
- multimodality
- minimum description length
- Kullback-Leibler divergence
- relative entropy
- computational learning theory
- entropy measure
- Jensen-Shannon
- KL divergence
- machine learning
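
Several of the keyphrases above name standard information-theoretic quantities. As an illustrative reference only, and not code from the paper, the following minimal NumPy sketch shows how entropy, Kullback-Leibler (KL) divergence, and Jensen-Shannon divergence are defined for discrete probability distributions; the function names and example distributions are hypothetical.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log 0 is treated as 0
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """KL divergence (relative entropy) D_KL(p || q) in bits.

    Assumes q > 0 wherever p > 0, otherwise the divergence is infinite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetrized, bounded variant of KL."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # mixture distribution
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Hypothetical example: two distributions over the same three alternatives.
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
print(entropy(p), kl_divergence(p, q), js_divergence(p, q))
```

Mutual information, also listed above, can be expressed in the same terms: it is the KL divergence between a joint distribution and the product of its marginals.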