Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization.
Pengcheng He, Baolin Peng, Song Wang, Yang Liu, Ruochen Xu, Hany Hassan, Yu Shi, Chenguang Zhu, Wayne Xiong, Michael Zeng, Jianfeng Gao, Xuedong Huang. Published in: ACL (1) (2023)
Keyphrases
- language model
- pre-trained
- language modeling
- training data
- document retrieval
- probabilistic model
- n-gram
- speech recognition
- information retrieval
- training examples
- mixture model
- test collection
- retrieval model
- query expansion
- control signals
- context sensitive
- ad hoc information retrieval
- data sets
- smoothing methods
- translation model
- relevance model
- prior knowledge
- Dirichlet prior
- word clouds
- feature space
- multimedia
- machine learning