Prune Once for All: Sparse Pre-Trained Language Models
Ofir Zafrir, Ariel Larey, Guy Boudoukh, Haihao Shen, Moshe Wasserblat. Published in: CoRR (2021)
Keyphrases
- language model
- pre-trained
- language modeling
- training data
- n-gram
- probabilistic model
- information retrieval
- document retrieval
- retrieval model
- speech recognition
- query expansion
- training examples
- statistical language models
- test collection
- control signals
- neural network
- smoothing methods
- language models for information retrieval
- document ranking
- high dimensional
- relevance model
- sparse representation
- decision trees
- learning algorithm
- statistical language modeling
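The keyphrases above connect the paper to sparsity in neural language models. As a point of reference, the simplest form of sparsification is unstructured magnitude pruning: remove the smallest-magnitude weights until a target sparsity is reached. The sketch below is a generic, hypothetical illustration of that idea in plain Python; it is not the paper's actual method, which prunes during pre-training with a gradual sparsity schedule so the sparse model can later be fine-tuned on downstream tasks without re-pruning.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights.

    Generic unstructured magnitude pruning for illustration only;
    the Prune Once for All paper applies pruning during the
    pre-training stage rather than as a one-shot post-hoc step.
    """
    k = int(len(weights) * sparsity)  # how many weights to remove
    if k == 0:
        return list(weights)
    # The k-th smallest absolute value becomes the pruning threshold.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]


# Example: prune half of a toy weight vector.
pruned = magnitude_prune([0.9, -0.05, 0.4, -0.7, 0.01, 0.3], 0.5)
# pruned -> [0.9, 0.0, 0.4, -0.7, 0.0, 0.0]
```

In practice such pruning is applied per tensor (or per layer) to a trained network, often followed by additional training to recover accuracy lost when weights are removed.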