EmbRace: Accelerating Sparse Communication for Distributed Training of NLP Neural Networks.
Shengwei Li, Zhiquan Lai, Dongsheng Li, Xiangyu Ye, Yabo Duan. Published in: CoRR (2021)
Keyphrases
- neural network
- training process
- communication overhead
- communication cost
- training algorithm
- computer networks
- distributed control
- natural language processing
- fully distributed
- feedforward neural networks
- backpropagation
- distributed computation
- distributed systems
- neural network training
- backpropagation algorithm
- high dimensional
- text mining
- training examples
- pattern recognition
- multi-party
- neural network model
- global knowledge
- communication networks
- multi-agent
- single point of failure
- avoid overfitting
- fuzzy logic
- hearing impaired
- multilayer neural network
- natural language
- error backpropagation
- open systems
- exchange information
- artificial neural networks
- peer-to-peer
- sparse representation
- mobile agents
- distributed environment
- neural nets
- fault tolerant
- neural network structure
- multilayer perceptron
- language processing
- question answering
- training set
- multi-agent systems
- training data
- learning algorithm