MPC-BERT: A Pre-Trained Language Model for Multi-Party Conversation Understanding.
Jia-Chen Gu, Chongyang Tao, Zhen-Hua Ling, Can Xu, Xiubo Geng, Daxin Jiang. Published in: ACL/IJCNLP (1) (2021)
Keyphrases
- language model
- multi-party
- pre-trained
- language modeling
- privacy preserving
- n-gram
- information retrieval
- probabilistic model
- speech recognition
- retrieval model
- query expansion
- mixture model
- test collection
- document retrieval
- description language
- query terms
- context-sensitive
- training data
- neural network
- translation model
- control signals
- ad hoc information retrieval
- training examples
- reinforcement learning
- smoothing methods