Dispersed Exponential Family Mixture VAEs for Interpretable Text Generation
Wenxian Shi, Hao Zhou, Ning Miao, Lei Li
Published in: ICML (2020)
Keyphrases
- exponential family
- text generation
- natural language generation
- log likelihood
- mixture model
- maximum likelihood
- graphical models
- closed form
- density estimation
- statistical models
- missing values
- natural language
- hidden variables
- probability density function
- theorem prover
- variational methods
- order statistics
- probabilistic model
- em algorithm
- approximate inference
- markov chain monte carlo
- dialogue system
- multiscale
- natural language processing
- median filter
- image segmentation
- monte carlo
- statistical model
- missing data
- particle filter