Homeomorphic-Invariance of EM: Non-Asymptotic Convergence in KL Divergence for Exponential Families via Mirror Descent.
Frederik Kunstner, Raunak Kumar, Mark Schmidt. Published in: AISTATS (2021)
Keyphrases
- exponential family
- kl divergence
- maximum likelihood
- mixture model
- density estimation
- expectation maximization
- em algorithm
- log likelihood
- gaussian mixture model
- graphical models
- missing values
- closed form
- generative model
- statistical models
- parameter estimation
- hidden variables
- gaussian distribution
- probabilistic model
- probability density function
- gaussian mixture
- variational methods
- maximum a posteriori
- unsupervised learning
- order statistics
- statistical model
- learning algorithm
- markov chain monte carlo
- image segmentation
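
To make several of the keyphrases above concrete (expectation maximization, Gaussian mixture model, closed form, hidden variables), here is a minimal, illustrative Python sketch of the EM algorithm for a two-component 1-D Gaussian mixture. It is generic EM, not the mirror-descent analysis or KL-divergence convergence result of the paper, and all function and variable names are hypothetical.

```python
# Minimal sketch: EM for a 2-component 1-D Gaussian mixture (illustrative only).
import numpy as np

def em_gmm_1d(x, n_iter=100, seed=0):
    """Fit a 2-component 1-D Gaussian mixture by EM; returns (weights, means, variances)."""
    rng = np.random.default_rng(seed)
    # Initialize mixing weights, component means, and variances.
    w = np.array([0.5, 0.5])
    mu = rng.choice(x, size=2, replace=False)
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each data point.
        lik = np.stack([
            w[k] * np.exp(-0.5 * (x - mu[k]) ** 2 / var[k]) / np.sqrt(2 * np.pi * var[k])
            for k in range(2)
        ], axis=1)                                   # shape (n, 2)
        resp = lik / lik.sum(axis=1, keepdims=True)
        # M-step: closed-form maximization of the expected complete-data log-likelihood.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

if __name__ == "__main__":
    # Example usage on synthetic data drawn from two Gaussians.
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 0.5, 500)])
    print(em_gmm_1d(data))
```

Each iteration alternates a posterior (E) step and a closed-form maximization (M) step; the paper's contribution is a non-asymptotic convergence analysis of such updates for exponential families by interpreting them as mirror descent.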