Parameter-Efficient Mixture-of-Experts Architecture for Pre-trained Language Models.
Ze-Feng Gao, Peiyu Liu, Wayne Xin Zhao, Zhong-Yi Lu, Ji-Rong Wen
Published in: CoRR (2022)
Keyphrases
- language model
- mixture model
- language modeling
- pre-trained
- probabilistic model
- speech recognition
- test collection
- information retrieval
- statistical language models
- retrieval model
- n-gram
- document retrieval
- query expansion
- document ranking
- relevance model
- dimensionality reduction
- expectation maximization