A Tight Lower Bound on the Mutual Information of a Binary and an Arbitrary Finite Random Variable in Dependence of the Variational Distance
A. G. Stefani, Johannes B. Huber, Christophe Jardin, Heinrich Sticht
Published in: CoRR (2013)
Keyphrases
- lower bound
- random variables
- mutual information
- upper bound
- statistical dependence
- graphical models
- binary-valued
- probability distribution
- Kullback-Leibler divergence
- information-theoretic
- image registration
- worst case
- similarity measure
- latent variables
- distribution function
- NP-hard
- objective function
- joint distribution
- marginal distributions
- distance measure
- stochastic dominance
- feature selection
- information gain
- free energy
- distance function
- conditional probabilities
- Bayesian networks
- conditionally independent
- image segmentation
- probabilistic model
- continuous variables
- conditional distribution
- joint probability distribution
- Euclidean distance
- optimal solution
- VC dimension
- sample complexity
- approximate inference
- hazard rate
- fuzzy random variables
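To make the two central quantities in the title concrete, the following sketch computes the mutual information of a binary X and a finite Y together with the variational (total variation) distance between the two conditional distributions of Y. The joint distribution used here is purely illustrative and not taken from the paper; the bound itself is not reproduced, only the quantities it relates.

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits for a joint distribution given as a 2 x n matrix
    (rows: values of the binary X, columns: values of the finite Y)."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

def variational_distance(p, q):
    """Total variation distance between two distributions on the same
    finite alphabet: half the L1 distance of the probability vectors."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# Hypothetical joint distribution of a binary X and a ternary Y.
joint = [[0.2, 0.2, 0.1],
         [0.1, 0.1, 0.3]]

px = [sum(row) for row in joint]
cond0 = [p / px[0] for p in joint[0]]  # P(Y | X = 0)
cond1 = [p / px[1] for p in joint[1]]  # P(Y | X = 1)

print(f"I(X;Y)   = {mutual_information(joint):.4f} bits")
print(f"V(P0,P1) = {variational_distance(cond0, cond1):.4f}")
```

When X and Y are independent the mutual information and the variational distance between the conditionals are both zero; larger statistical dependence pushes both quantities up, which is the regime the paper's lower bound characterizes.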