Bayesian estimation of the Kullback-Leibler divergence for categorical systems using mixtures of Dirichlet priors.
Francesco Camaglia, Ilya Nemenman, Thierry Mora, Aleksandra M. Walczak. Published in: CoRR (2023)
Keyphrases
- kullback leibler divergence
- bayesian estimation
- mixture model
- probability density function
- posterior distribution
- expectation maximization
- bayesian framework
- prior distribution
- em algorithm
- mutual information
- information theoretic
- gaussian mixture
- information theory
- distance measure
- generative model
- maximum a posteriori
- gaussian mixture model
- posterior probability
- density estimation
- marginal distributions
- generalized gaussian
- model selection
- probabilistic model
- hyperparameters
- probability distribution
- prior information
- parameter estimation
- prior knowledge
- unsupervised learning
- markov random field
- gaussian distribution
- machine learning
- sample size
- computer vision
- feature selection
- pairwise
- random sampling
- maximum likelihood
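
To make the topic of the paper concrete, below is a minimal illustrative sketch of Bayesian KL-divergence estimation for categorical data under a single symmetric Dirichlet prior, with the posterior mean approximated by Monte Carlo sampling. It is not the paper's method (which uses *mixtures* of Dirichlet priors); the function name, parameters, and example counts are hypothetical.

```python
import numpy as np

def bayes_kl_estimate(counts_p, counts_q, alpha=1.0, n_samples=2000, rng=None):
    """Posterior-mean estimate of D_KL(p || q) for two categorical systems.

    Illustrative sketch only: assumes a single symmetric Dirichlet(alpha)
    prior on each distribution, so with multinomial counts n the posterior
    is Dirichlet(alpha + n). The posterior mean of the divergence is
    approximated by averaging over posterior samples.
    """
    rng = np.random.default_rng(rng)
    counts_p = np.asarray(counts_p, dtype=float)
    counts_q = np.asarray(counts_q, dtype=float)

    # Draw categorical distributions from the two Dirichlet posteriors.
    p_samples = rng.dirichlet(counts_p + alpha, size=n_samples)
    q_samples = rng.dirichlet(counts_q + alpha, size=n_samples)

    # KL divergence of each posterior sample pair, then the Monte Carlo mean.
    kl = np.sum(p_samples * (np.log(p_samples) - np.log(q_samples)), axis=1)
    return kl.mean(), kl.std(ddof=1) / np.sqrt(n_samples)

# Example: observed counts over the same categorical alphabet (made up).
counts_p = [12, 7, 3, 0, 1]
counts_q = [5, 9, 4, 2, 2]
estimate, stderr = bayes_kl_estimate(counts_p, counts_q, alpha=0.5)
print(f"D_KL estimate: {estimate:.3f} nats (MC stderr {stderr:.3f})")
```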