Distilling Word Meaning in Context from Pre-trained Language Models.
Yuki Arase, Tomoyuki Kajiwara. Published in: EMNLP (Findings), 2021.
Keyphrases
- language model
- pre-trained
- language modeling
- word meaning
- n-gram
- probabilistic model
- speech recognition
- document retrieval
- query expansion
- retrieval model
- information retrieval
- test collection
- relevance model
- text classification
- natural language
- vector space model
- out-of-vocabulary
- training data
- contextual information
- translation model
- learning algorithm