Rewire-then-Probe: A Contrastive Recipe for Probing Biomedical Knowledge of Pre-trained Language Models.
Zaiqiao Meng, Fangyu Liu, Ehsan Shareghi, Yixuan Su, Charlotte Collins, Nigel Collier
Published in: CoRR (2021)
Keyphrases
- language model
- pre-trained
- language modeling
- n-gram
- probabilistic model
- document retrieval
- speech recognition
- information retrieval
- retrieval model
- language modelling
- statistical language models
- learning process
- query expansion
- data mining
- relevance model
- training data
- test collection
- text mining
- prior knowledge
- document ranking
- bayesian networks
- multi-modal
- neural network
- language models for information retrieval