MPC-BERT: A Pre-Trained Language Model for Multi-Party Conversation Understanding.
Jia-Chen Gu, Chongyang Tao, Zhen-Hua Ling, Can Xu, Xiubo Geng, Daxin Jiang. Published in: CoRR (2021)
Keyphrases
- language model
- multi party
- pre trained
- language modeling
- privacy preserving
- document retrieval
- n gram
- probabilistic model
- speech recognition
- retrieval model
- ad hoc information retrieval
- training data
- information retrieval
- mixture model
- test collection
- context sensitive
- smoothing methods
- query expansion
- query terms
- translation model
- description language
- training examples
- learning algorithm
- bayesian networks
- neural network