An Information-Theoretic View of Generalization via Wasserstein Distance.
Hao Wang, Mario Díaz, José Cândido Silveira Santos Filho, Flávio P. Calmon. Published in: ISIT (2019)
Keyphrases
- information-theoretic
- information theory
- mutual information
- Kullback-Leibler divergence
- theoretic framework
- KL divergence
- information-theoretic measures
- information bottleneck
- log-likelihood
- relative entropy
- Jensen-Shannon divergence
- minimum description length
- entropy measure
- Euclidean distance
- multi-modality
- distance measure
- Kullback-Leibler
- image registration
- pointwise
- computational learning theory
- multiple views
- image analysis
- distributional clustering