Compression without a common prior: an information-theoretic justification for ambiguity in language.
Brendan Juba, Adam Tauman Kalai, Sanjeev Khanna, Madhu Sudan. Published in: ICS (2011)
Keyphrases
- information theoretic
- mutual information
- information theory
- theoretic framework
- jensen-shannon divergence
- entropy measure
- prior knowledge
- information theoretic measures
- log likelihood
- information bottleneck
- minimum description length
- relative entropy
- prior information
- kullback-leibler divergence
- bregman divergences
- kl divergence
- multi-modality
- computational learning theory
- text categorization
- medical images
- image registration
- pattern recognition