A tight lower bound on the mutual information of a binary and an arbitrary finite random variable as a function of the variational distance.
Arno G. Stefani, Johannes B. Huber, Christophe Jardin, Heinrich Sticht. Published in: AusCTW (2014)
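The title relates two quantities: the mutual information I(X;Y) of a binary X and a finite Y, and the variational (total variation) distance. As a minimal sketch of these quantities, the following computes both from an arbitrary illustrative joint pmf (the numbers are not from the paper, and the distance here is taken between the joint pmf and the product of its marginals; the 1/2 factor is one common convention for total variation distance):

```python
import numpy as np

# Illustrative joint pmf of a binary X (rows) and a finite Y (columns).
# These values are an arbitrary example, not taken from the paper.
p_xy = np.array([[0.20, 0.15, 0.05],
                 [0.10, 0.25, 0.25]])

p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X (binary)
p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (finite)
prod = p_x * p_y                        # product of the marginals

# Mutual information I(X;Y) in bits, summing only over positive entries.
mask = p_xy > 0
mi = float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

# Variational (total variation) distance between the joint pmf and the
# product of its marginals, with the conventional 1/2 factor.
vd = 0.5 * float(np.abs(p_xy - prod).sum())

print(f"I(X;Y) = {mi:.4f} bits, variational distance = {vd:.4f}")
```

Both quantities vanish exactly when X and Y are independent, which is why a lower bound on one in terms of the other is meaningful.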
Keyphrases
- lower bound
- random variables
- mutual information
- upper bound
- distribution function
- probability distribution
- graphical models
- unit interval
- image registration
- worst case
- Bayesian networks
- similarity measure
- information theoretic
- objective function
- Kullback-Leibler divergence
- feature selection
- binary valued
- optimal solution
- Euclidean distance
- stochastic dominance
- latent variables
- joint distribution
- NP-hard
- distance measure
- hazard rate
- conditionally independent
- conditional distribution
- distance function
- marginal distributions
- variational methods
- latent variable models
- information gain
- joint probability distribution
- image segmentation
- conditional probabilities
- dynamic programming
- machine learning