Fine-Tuning Pre-trained Language Model with Weak Supervision: A Contrastive-Regularized Self-Training Approach.
Yue Yu, Simiao Zuo, Haoming Jiang, Wendi Ren, Tuo Zhao, Chao Zhang
Published in: CoRR (2020)
Keyphrases
- k-means
- fine-tuning
- language model
- pre-trained
- language modeling
- training data
- n-gram
- training set
- training examples
- information retrieval
- probabilistic model
- document retrieval
- test collection
- speech recognition
- semi-supervised learning
- context-sensitive
- query expansion
- retrieval model
- mixture model
- control signals
- cost-sensitive
- ad hoc information retrieval
- smoothing methods
- least squares
- semi-supervised
- non-stationary
- face recognition
- machine learning
- neural network