Information-Theoretic Bounds on Transfer Generalization Gap Based on Jensen-Shannon Divergence.
Sharu Theresa Jose, Osvaldo Simeone. Published in: CoRR (2020)
Keyphrases
- Jensen-Shannon divergence
- information-theoretic
- information theory
- mutual information
- Jensen-Shannon
- theoretic framework
- information-theoretic measures
- information bottleneck
- transfer learning
- log-likelihood
- Kullback-Leibler divergence
- entropy measure
- learning algorithm
- pattern recognition
- KL divergence
- selection criterion
- Gaussian mixture model
- training set
- machine learning
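
For context on the divergence named in the title: the Jensen-Shannon divergence is defined via the Kullback-Leibler divergence as JS(P, Q) = ½ KL(P‖M) + ½ KL(Q‖M) with mixture M = (P + Q)/2. The sketch below computes it for discrete distributions using NumPy; it is purely illustrative of the definition, not an implementation of the paper's generalization-gap bounds, and the function names are my own.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) in nats for discrete distributions.

    Assumes q > 0 wherever p > 0; terms with p = 0 contribute zero.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: 0.5*KL(p||m) + 0.5*KL(q||m), m = (p+q)/2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Example: two distributions over a 3-symbol alphabet.
p = [0.5, 0.4, 0.1]
q = [0.1, 0.4, 0.5]
print(js_divergence(p, q))  # always finite and at most log(2) ≈ 0.693 nats
```

Unlike the KL divergence, the Jensen-Shannon divergence is symmetric and bounded (by log 2 in nats), which is the property that makes it attractive for the kind of information-theoretic bounds the paper studies.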