Homeomorphic-Invariance of EM: Non-Asymptotic Convergence in KL Divergence for Exponential Families via Mirror Descent (Extended Abstract).
Frederik Kunstner, Raunak Kumar, Mark Schmidt
Published in: IJCAI (2022)
Keyphrases
- extended abstract
- KL divergence
- exponential family
- maximum likelihood
- mixture model
- density estimation
- expectation maximization
- EM algorithm
- log likelihood
- graphical models
- statistical models
- closed form
- probabilistic model
- Gaussian mixture model
- hidden variables
- missing values
- parameter estimation
- generative model
- probability density function
- unsupervised learning
- Gaussian mixture
- variational methods
- Gaussian distribution
- statistical model
- order statistics
- maximum a posteriori
- k-means
- reinforcement learning
- image segmentation
- high dimensional data
- model selection
- language model
- prior knowledge