Information Theoretic lower bounds on negative log likelihood.
Luis Alfonso Lastras-Montaño
Published in: ICLR (Poster), 2019
Keyphrases
- information theoretic
- lower bound
- mutual information
- upper bound
- information theory
- theoretic framework
- worst case
- information bottleneck
- objective function
- information theoretic measures
- jensen shannon divergence
- minimum description length
- log likelihood
- relative entropy
- optimal solution
- entropy measure
- learning algorithm
- kl divergence
- kullback leibler divergence
- vc dimension
- machine learning
- multi modality
- computational learning theory
- sample complexity
- medical images
- image registration
- computer vision