On the Importance of the Kullback-Leibler Divergence Term in Variational Autoencoders for Text Generation
Victor Prokhorov, Ehsan Shareghi, Yingzhen Li, Mohammad Taher Pilehvar, Nigel Collier
Published in: NGT@EMNLP-IJCNLP (2019)
Keyphrases
- kullback leibler divergence
- text generation
- natural language generation
- mutual information
- probability density function
- kl divergence
- information theoretic
- information theory
- distance measure
- natural language
- denoising
- marginal distributions
- theorem prover
- image segmentation
- information retrieval
- probability distribution
- data mining
- image analysis
- diffusion tensor
- feature selection
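The central keyphrases above (KL divergence, variational autoencoders) refer to the regularization term in the VAE objective. For a diagonal Gaussian posterior q(z|x) = N(μ, σ²) and a standard normal prior p(z) = N(0, I), this KL term has a well-known closed form. The sketch below is purely illustrative (the function name and interface are not from the paper):

```python
import math

def gaussian_kl(mu, log_var):
    """Closed-form KL(q || p) between a diagonal Gaussian q = N(mu, exp(log_var))
    and a standard normal prior p = N(0, I), summed over latent dimensions:
        KL = 0.5 * sum(exp(log_var) + mu^2 - 1 - log_var)
    """
    return sum(
        0.5 * (math.exp(lv) + m * m - 1.0 - lv)
        for m, lv in zip(mu, log_var)
    )

# The KL term vanishes exactly when the posterior matches the prior,
# which is the "posterior collapse" failure mode studied in this line of work.
print(gaussian_kl([0.0, 0.0], [0.0, 0.0]))  # → 0.0
```

In text-generation VAEs, driving this term to zero means the decoder ignores the latent code, which is why controlling its magnitude matters.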