DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models.
Damai Dai, Chengqi Deng, Chenggang Zhao, R. X. Xu, Huazuo Gao, Deli Chen, Jiashi Li, Wangding Zeng, Xingkai Yu, Y. Wu, Zhenda Xie, Y. K. Li, Panpan Huang, Fuli Luo, Chong Ruan, Zhifang Sui, Wenfeng Liang
Published in: ACL (1) (2024)
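The title's core idea, a Mixture-of-Experts layer that routes each token to a small subset of specialized experts, can be illustrated with a minimal top-k gating sketch. This is a generic MoE illustration in NumPy, not the paper's DeepSeekMoE architecture; all names, shapes, and the choice of k are assumptions for demonstration.

```python
import numpy as np

def topk_moe_forward(x, gate_w, experts_w, k=2):
    """Generic top-k MoE sketch (illustrative, not DeepSeekMoE itself):
    score every expert per token, keep the k best, and mix their
    outputs with softmax-normalized gate weights."""
    logits = x @ gate_w                          # (tokens, n_experts) gate scores
    topk = np.argsort(logits, axis=-1)[:, -k:]   # indices of the k best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, topk[t]]
        w = np.exp(sel - sel.max())
        w /= w.sum()                             # softmax over the selected experts only
        for weight, e in zip(w, topk[t]):
            out[t] += weight * (x[t] @ experts_w[e])  # weighted expert output
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                      # 4 tokens, hidden size 8 (made-up shapes)
gate_w = rng.normal(size=(8, 6))                 # router for 6 experts
experts_w = rng.normal(size=(6, 8, 8))           # each expert: an 8x8 linear map
y = topk_moe_forward(x, gate_w, experts_w, k=2)
print(y.shape)                                   # (4, 8)
```

Because only k of the 6 experts run per token, compute stays roughly constant as the expert count grows; the paper's contribution concerns how to make those experts specialize rather than overlap.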
Keyphrases
- language model
- mixture model
- language modeling
- domain experts