On the entropy region of discrete and continuous random variables and network information theory.
Sormeh Shadbakht, Babak Hassibi. Published in: ACSCC (2008)
Keyphrases
- information theory
- random variables
- continuous variables
- information theoretic
- graphical models
- probability distribution
- conditional entropy
- statistical learning
- Jensen-Shannon divergence
- Bayesian networks
- conditional independence
- joint distribution
- latent variables
- distribution function
- relative entropy
- conditionally independent
- probability density
- directed acyclic graph
- independent and identically distributed
- conditional distributions
- Shannon entropy
- normal distribution
- stochastic optimization problems
- conditional probabilities
- network structure
- information geometry
- random vectors
- machine learning
- marginal distributions
- Kullback-Leibler divergence
- mutual information
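Several of the keyphrases above (Shannon entropy, conditional entropy, mutual information, joint distribution) refer to quantities that are easy to illustrate concretely. The sketch below computes them for a hypothetical joint distribution of two binary random variables; the numbers are illustrative only and not taken from the paper.

```python
import numpy as np

# Hypothetical joint pmf of two binary random variables X and Y
# (illustrative example, not from the paper): independent uniform bits.
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])

def entropy(p):
    """Shannon entropy in bits of a pmf given as an array."""
    p = p[p > 0]  # drop zero-probability outcomes (0 * log 0 := 0)
    return float(-np.sum(p * np.log2(p)))

p_x = p_xy.sum(axis=1)           # marginal distribution of X
p_y = p_xy.sum(axis=0)           # marginal distribution of Y

H_x = entropy(p_x)               # H(X)
H_y = entropy(p_y)               # H(Y)
H_xy = entropy(p_xy.flatten())   # joint entropy H(X, Y)

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y);
# for independent variables it is zero.
I_xy = H_x + H_y - H_xy
print(H_x, H_y, H_xy, I_xy)      # → 1.0 1.0 2.0 0.0
```

For dependent variables (e.g. a joint pmf that concentrates mass on the diagonal), the same computation yields a strictly positive mutual information.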