Active Learning on Pre-trained Language Model with Task-Independent Triplet Loss.
Seungmin Seo, Donghyun Kim, Youbin Ahn, Kyong-Ho Lee. Published in: AAAI (2022)
Keyphrases
- language model
- pre-trained
- active learning
- training examples
- language modeling
- n-gram
- training data
- document retrieval
- retrieval model
- probabilistic model
- information retrieval
- test collection
- speech recognition
- query expansion
- context-sensitive
- learning algorithm
- ad hoc information retrieval
- mixture model
- labeled data
- translation model
- smoothing methods
- relevance model
- machine learning
- relevance feedback
- data sets
- unlabeled data
- semi-supervised learning
- supervised learning
- small number