Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence.
Masato Uchida
Published in: SCIS&ISIS (2014)
Keyphrases
- parameter estimation
- kullback leibler divergence
- generalized gaussian
- em algorithm
- probability density function
- mutual information
- maximum likelihood
- expectation maximization
- information theoretic
- least squares
- information theory
- posterior distribution
- markov random field
- model selection
- unsupervised learning
- statistical models
- distance measure
- random fields
- marginal distributions
- maximum a posteriori
- mixture model
- maximum likelihood estimation
- supervised learning
- semi supervised
- probability distribution
- prior model
- image processing
- probabilistic model
- hyperparameters
- gaussian mixture model
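The entry covers weight estimation for an exponential (log-linear) mixture under a symmetric Kullback-Leibler criterion. As a rough illustration of those keyphrase concepts only, the sketch below computes the symmetric KL (Jeffreys) divergence between discrete distributions and grid-searches the weight of a two-component log-linear mixture; the two-component setup, the grid search, and all function names are assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch: symmetric KL (Jeffreys) divergence and a naive weight
# search for a two-component exponential (log-linear) mixture. Not the paper's
# estimation procedure; all names and choices here are illustrative assumptions.
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def symmetric_kl(p, q):
    """Symmetric KL (Jeffreys) divergence: D(p || q) + D(q || p)."""
    return kl(p, q) + kl(q, p)

def exponential_mixture(p1, p2, w):
    """Log-linear combination p1^w * p2^(1-w), renormalized to sum to one."""
    m = np.asarray(p1, dtype=float) ** w * np.asarray(p2, dtype=float) ** (1.0 - w)
    return m / m.sum()

def fit_weight(target, p1, p2, grid=np.linspace(0.0, 1.0, 101)):
    """Pick the mixture weight minimizing symmetric KL to the target (grid search)."""
    return min(grid, key=lambda w: symmetric_kl(target, exponential_mixture(p1, p2, w)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p1 = rng.dirichlet(np.ones(10))
    p2 = rng.dirichlet(np.ones(10))
    target = exponential_mixture(p1, p2, 0.3)  # synthetic target with known weight
    print("estimated weight:", fit_weight(target, p1, p2))
```

On synthetic data with a known weight, the grid search recovers a value close to the true weight; a practical estimator would replace the grid search with an iterative update (e.g., EM-style), which is beyond this sketch.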