Information-Theoretic Probing with Minimum Description Length.
Elena Voita, Ivan Titov
Published in: EMNLP (1) (2020)
Keyphrases
- minimum description length
- information theoretic
- mutual information
- information theory
- theoretic framework
- information bottleneck
- computational learning theory
- information theoretic measures
- multi-modality
- relative entropy
- Kullback-Leibler divergence
- entropy measure
- Jensen-Shannon divergence
- data mining
- log likelihood
- image processing
- computer vision
- statistical data
- KL divergence
- image registration
- image analysis
- learning algorithm