SE-MoE: A Scalable and Efficient Mixture-of-Experts Distributed Training and Inference System
Liang Shen
Zhihua Wu
Weibao Gong
Hongxiang Hao
Yangfan Bai
HuaChao Wu
Xinxuan Wu
Haoyi Xiong
Dianhai Yu
Yanjun Ma
Published in: CoRR (2022)
Keyphrases
lightweight
scalable distributed
training set
high scalability
online learning
highly scalable
computationally expensive
training examples
distributed systems
supervised learning
cooperative
distributed environment
training algorithm
multi agent
database
domain knowledge
training phase
data sets
real time