From Clozing to Comprehending: Retrofitting Pre-trained Language Model to Pre-trained Machine Reader.
Weiwen Xu, Xin Li, Wenxuan Zhang, Meng Zhou, Lidong Bing, Wai Lam, Luo Si
Published in: CoRR (2022)
Keyphrases
- pre-trained
- language model
- language modeling
- training data
- n-gram
- speech recognition
- probabilistic model
- information retrieval
- retrieval model
- query expansion
- training examples
- test collection
- document retrieval
- mixture model
- context sensitive
- control signals
- translation model
- data sets
- ad hoc information retrieval
- hidden Markov models
- active learning
- Bayesian networks
- Dirichlet prior