ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training.

Weizhen Qi, Yu Yan, Yeyun Gong, Dayiheng Liu, Nan Duan, Jiusheng Chen, Ruofei Zhang, Ming Zhou
Published in: EMNLP (Findings) (2020)
Keyphrases
  • n-gram
  • predicting future
  • Viterbi algorithm
  • language model
  • training set
  • neural network
  • text classification
  • web documents
  • language modeling
  • variable length
  • language independent