Wasserstein Autoencoders with Mixture of Gaussian Priors for Stylized Text Generation
Amirpasha Ghabussi, Lili Mou, Olga Vechtomova. Published in: TDS (2021)
Keyphrases
- text generation
- heavy tailed
- natural language generation
- gaussian distribution
- mixture distribution
- gaussian mixture
- denoising
- gaussian densities
- gaussian model
- mixture model
- gaussian mixture model
- natural language
- generalized em algorithm
- mixture distributions
- maximum likelihood
- multivariate gaussian
- gaussian density
- mixture of gaussians
- normal distribution
- pointwise
- expectation maximization
- bayesian framework
- generalized gaussian
- gaussian noise
- dirichlet process
- covariance matrices
- theorem prover
- prior knowledge
- image processing
- image prior
- neural network
- restricted boltzmann machine
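As a small illustration of the central idea named in the title and keyphrases (a mixture-of-Gaussians prior over a latent space, where each component can correspond to a style), the following sketch samples latent vectors from such a prior. All parameter values and names here are hypothetical, chosen only for demonstration; this is not the paper's actual model or configuration.

```python
import numpy as np

# Hypothetical mixture-of-Gaussians prior over a latent space.
# Each component could correspond to one style in stylized generation.
rng = np.random.default_rng(0)

K = 3      # number of mixture components (illustrative)
d = 8      # latent dimensionality (illustrative)
weights = np.full(K, 1.0 / K)              # uniform mixing weights
means = rng.normal(0.0, 2.0, size=(K, d))  # component means
sigma = 0.5                                # shared isotropic std deviation

def sample_prior(n):
    """Draw n latent vectors z ~ sum_k w_k * N(mu_k, sigma^2 I)."""
    comps = rng.choice(K, size=n, p=weights)         # pick a component per sample
    z = means[comps] + sigma * rng.normal(size=(n, d))
    return comps, z

comps, z = sample_prior(5)
print(comps.shape, z.shape)  # (5,) (5, 8)
```

In a WAE-style setup, a decoder would map such samples z back to text, and the encoder's aggregate posterior would be matched to this mixture prior rather than to a single Gaussian.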