A discrete complement of Lyapunov's inequality and its information theoretic consequences.
James Melbourne, Gerardo Palafox-Castillo
Published in: CoRR (2021)
Keyphrases
- information theory
- information-theoretic measures
- mutual information
- Jensen-Shannon divergence
- Kullback-Leibler divergence (relative entropy)
- entropy measures
- log-likelihood
- multimodality
- information bottleneck
- neural networks
- distributional clustering
- minimum description length
- pattern recognition
- image processing
- control law
- image registration