MOCHA: A Multi-Task Training Approach for Coherent Text Generation from Cognitive Perspective.
Zhe Hu, Hou Pong Chan, Lifu Huang
Published in: EMNLP (2022)
Keyphrases
- multi task
- text generation
- multi task learning
- learning tasks
- natural language generation
- multitask learning
- multiple tasks
- gaussian processes
- multi class
- sparse learning
- learning problems
- transfer learning
- supervised learning
- feature selection
- information gain
- image classification
- training set
- reinforcement learning
- information retrieval
- training examples
- theoretical analysis
- theorem prover
- natural language