μ-Forcing: Training Variational Recurrent Autoencoders for Text Generation
Dayiheng Liu, Yang Xue, Feng He, Yuanyuan Chen, Jiancheng Lv
Published in: CoRR (2019)
Keyphrases
- text generation
- natural language generation
- denoising
- feedforward neural networks
- training process
- recurrent networks
- natural language
- recurrent neural networks
- training phase
- learning algorithm
- training algorithm
- domain independent
- artificial intelligence
- information retrieval
- training examples