Information-Theoretic Distillation for Reference-less Summarization
Jaehun Jung, Ximing Lu, Liwei Jiang, Faeze Brahman, Peter West, Pang Wei Koh, Yejin Choi. Published in: CoRR (2024)
Keyphrases
- information theoretic
- mutual information
- information theory
- theoretic framework
- information bottleneck
- Kullback-Leibler divergence
- log likelihood
- information theoretic measures
- entropy measure
- minimum description length
- Jensen-Shannon divergence
- KL divergence
- relative entropy
- similarity measure
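Several of the keyphrases above name standard information-theoretic quantities (KL divergence, relative entropy, Jensen-Shannon divergence). As a minimal illustration of what these measures compute — a generic sketch for discrete distributions, not code from the paper itself:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), a.k.a. relative entropy,
    for discrete distributions given as aligned probability lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetrized, smoothed variant of KL,
    measured against the mixture distribution m = (p + q) / 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Two example distributions over three outcomes (illustrative values only).
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
print(kl_divergence(p, q))  # asymmetric: D(p||q) != D(q||p) in general
print(js_divergence(p, q))  # symmetric and bounded by log 2
```

Unlike KL divergence, the Jensen-Shannon divergence is symmetric in its arguments and always finite, which is why it is often preferred as a similarity measure between distributions.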