HetuMoE: An Efficient Trillion-scale Mixture-of-Expert Distributed Training System
Xiaonan Nie, Pinxue Zhao, Xupeng Miao, Tong Zhao, Bin Cui. Published in: CoRR (2022)
Keyphrases
- distributed systems
- cooperative
- mixture model
- multi agent
- human experts
- distributed environment
- domain experts
- lightweight
- computer networks
- training algorithm
- peer to peer
- mobile agents
- em algorithm
- small scale
- serious games
- subject matter experts
- training process
- distributed data
- distributed network
- data sets
- test set
- online learning
- scale space
- sensor networks
- training set
- data streams
- information systems