Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization.
Pengcheng He, Baolin Peng, Liyang Lu, Song Wang, Jie Mei, Yang Liu, Ruochen Xu, Hany Hassan Awadalla, Yu Shi, Chenguang Zhu, Wayne Xiong, Michael Zeng, Jianfeng Gao, Xuedong Huang
Published in: CoRR (2022)
Keyphrases
- language model
- pre-trained
- language modeling
- n-gram
- probabilistic model
- training data
- document retrieval
- retrieval model
- information retrieval
- query expansion
- speech recognition
- context-sensitive
- training examples
- mixture model
- test collection
- smoothing methods
- ad hoc information retrieval
- translation model
- relevance model
- control signals
- co-occurrence
- feature vectors
- high-dimensional
- learning algorithm
- machine learning