Entropy and the Kullback-Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation.
Marco Scutari
Published in: CoRR (2023)
Keyphrases
- Kullback-Leibler divergence
- efficient implementation
- Bayesian networks
- computational complexity
- information theory
- mutual information
- information-theoretic
- probability density function
- KL divergence
- distance measure
- probability distribution
- marginal distributions
- random variables
- conditional probabilities
- bit rate
- diffusion tensor
- model selection
- image registration
- feature extraction
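For context on the keyphrases above, a minimal sketch of the quantities involved, using standard textbook definitions rather than anything quoted from the paper: the Shannon entropy, the Kullback-Leibler divergence, and the way entropy decomposes over a Bayesian network's factorisation.

```latex
% Standard definitions, stated here for illustration only (not reproduced from the paper).
\[
  \mathrm{H}(\mathbf{X}) = -\sum_{\mathbf{x}} P(\mathbf{x}) \log P(\mathbf{x}),
  \qquad
  \mathrm{KL}(P \,\|\, Q) = \sum_{\mathbf{x}} P(\mathbf{x}) \log \frac{P(\mathbf{x})}{Q(\mathbf{x})}.
\]
% If P factorises along a Bayesian network, with \Pi_{X_i} denoting the parents of X_i,
% the entropy decomposes over the local conditional distributions:
\[
  \mathrm{H}(X_1, \ldots, X_n) = \sum_{i=1}^{n} \mathrm{H}(X_i \mid \Pi_{X_i}).
\]
```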