ScaLA: Accelerating Adaptation of Pre-Trained Transformer-Based Language Models via Efficient Large-Batch Adversarial Noise.
Minjia Zhang, Uma-Naresh Niranjan, Yuxiong He
Published in: CoRR (2022)
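The title refers to adapting (fine-tuning) pre-trained transformer language models with large-batch adversarial noise. As a rough, hedged illustration only, the sketch below shows a generic PGD-style adversarial perturbation applied to token embeddings during a fine-tuning step; it is not ScaLA's published algorithm, and the toy model, hyperparameters (`epsilon`, `alpha`, `k`), and helper names are all assumptions introduced for the example.

```python
import torch
import torch.nn.functional as F

# Toy stand-in for a pre-trained encoder: embedding layer + mean pooling + classifier.
# (Hypothetical model; the paper targets pre-trained transformers.)
vocab_size, hidden, num_classes = 1000, 64, 2
embedding = torch.nn.Embedding(vocab_size, hidden)
classifier = torch.nn.Linear(hidden, num_classes)
optimizer = torch.optim.AdamW(
    list(embedding.parameters()) + list(classifier.parameters()), lr=2e-5
)

def forward_from_embeds(embeds):
    # Mean-pool token embeddings and classify.
    return classifier(embeds.mean(dim=1))

def adversarial_step(input_ids, labels, epsilon=1e-2, alpha=1e-3, k=3):
    """One large-batch update with adversarial noise added to the embeddings.

    The perturbation `delta` is refined with k ascent steps (PGD-style) to
    increase the loss, then the parameters are updated on clean + perturbed inputs.
    """
    embeds = embedding(input_ids).detach()
    delta = torch.zeros_like(embeds, requires_grad=True)

    # Inner maximization: find noise that most increases the loss.
    for _ in range(k):
        loss = F.cross_entropy(forward_from_embeds(embeds + delta), labels)
        grad, = torch.autograd.grad(loss, delta)
        delta = (delta + alpha * grad.sign()).clamp(-epsilon, epsilon).detach()
        delta.requires_grad_(True)

    # Outer minimization: update parameters on the clean and perturbed batches.
    optimizer.zero_grad()
    clean_loss = F.cross_entropy(forward_from_embeds(embedding(input_ids)), labels)
    adv_loss = F.cross_entropy(
        forward_from_embeds(embedding(input_ids) + delta.detach()), labels
    )
    (clean_loss + adv_loss).backward()
    optimizer.step()
    return clean_loss.item(), adv_loss.item()

# Example: one step on a random large batch of shape (batch, seq_len).
batch = torch.randint(0, vocab_size, (256, 32))
labels = torch.randint(0, num_classes, (256,))
print(adversarial_step(batch, labels))
```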
Keyphrases
- language model
- language modeling
- pre-trained
- probabilistic model
- n-gram
- information retrieval
- document retrieval
- query expansion
- statistical language models
- speech recognition
- language modelling
- retrieval model
- test collection
- relevance model
- smoothing methods
- active learning
- language models for information retrieval
- data sets
- small number
- training data
- face recognition