ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training.
Weizhen Qi, Yu Yan, Yeyun Gong, Dayiheng Liu, Nan Duan, Jiusheng Chen, Ruofei Zhang, Ming Zhou
Published in: Findings of EMNLP 2020
Keyphrases
n-gram
predicting future
viterbi algorithm
language model
training set
neural network
text classification
web documents
language modeling
variable length
language independent