Skywork-MoE: A Deep Dive into Training Techniques for Mixture-of-Experts Language Models
Tianwen Wei, Bo Zhu, Liang Zhao, Cheng Cheng, Biye Li, Weiwei Lü, Peng Cheng, Jianhao Zhang, Xiaoyu Zhang, Liang Zeng, Xiaokun Wang, Yutuan Ma, Rui Hu, Shuicheng Yan, Han Fang, Yahui Zhou. Published in: CoRR (2024)
Keyphrases
- language model
- mixture model
- language modeling
- n gram
- probabilistic model
- document retrieval
- test collection
- query expansion
- retrieval model
- information retrieval
- expert finding
- speech recognition
- statistical language models
- context sensitive
- smoothing methods
- passage retrieval
- translation model
- query terms
- ad hoc information retrieval
- deep learning
- language models for information retrieval
- document ranking
- training set
- bayesian networks
- spoken term detection
- machine learning
- statistical language modeling