On the Importance of the Kullback-Leibler Divergence Term in Variational Autoencoders for Text Generation.
Victor Prokhorov, Ehsan Shareghi, Yingzhen Li, Mohammad Taher Pilehvar, Nigel Collier
Published in: CoRR (2019)
Keyphrases
- kullback leibler divergence
- text generation
- natural language generation
- mutual information
- information theoretic
- probability density function
- information theory
- kl divergence
- distance measure
- natural language
- denoising
- theorem prover
- diffusion tensor
- image segmentation
- image registration
- probability distribution
- computer vision
- information retrieval
- human brain
- marginal distributions
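For context on the keyphrases above, the KL divergence term referred to in the title is the regularizer in the standard VAE objective. The following is a minimal textbook sketch of that formulation (evidence lower bound with a standard Gaussian prior); it is background only, not an excerpt of the paper's own derivations.

```latex
% Standard VAE evidence lower bound (ELBO) for an observation x,
% with approximate posterior q_\phi(z \mid x) and prior p(z) = \mathcal{N}(0, I).
\begin{align}
  \mathcal{L}(\theta, \phi; x)
    &= \mathbb{E}_{q_\phi(z \mid x)}\!\left[ \log p_\theta(x \mid z) \right]
       - D_{\mathrm{KL}}\!\left( q_\phi(z \mid x) \,\|\, p(z) \right) \\
  % Closed form of the KL term when q_\phi(z \mid x) = \mathcal{N}(\mu, \mathrm{diag}(\sigma^2))
  D_{\mathrm{KL}}\!\left( q_\phi(z \mid x) \,\|\, \mathcal{N}(0, I) \right)
    &= \tfrac{1}{2} \sum_{j=1}^{d} \left( \mu_j^2 + \sigma_j^2 - \log \sigma_j^2 - 1 \right)
\end{align}
```

The second line is the usual closed-form KL between a diagonal-Gaussian posterior and the standard normal prior; when this term collapses to zero, the latent code carries no information about the input, which is the issue the paper's title alludes to.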