Mixture Models, Bayes Fisher Information, and Divergence Measures.
Majid Asadi, Nader Ebrahimi, Omid Kharazmi, Ehsan S. Soofi
Published in: IEEE Trans. Inf. Theory (2019)
Keyphrases
- mixture model
- fisher information
- information geometry
- probability density function
- exponential family
- gaussian mixture model
- em algorithm
- kl divergence
- riemannian metric
- probabilistic model
- density estimation
- expectation maximization
- generative model
- model selection
- unsupervised learning
- language model
- noise level
- gaussian mixture
- maximum likelihood
- euclidean space
- positive definite
- kullback leibler divergence
- log likelihood
- graphical models
- pattern recognition
- machine learning
- estimation error
- gaussian distribution
- support vector
- information theoretic
- bayesian networks
- support vector machine
- training data