Alleviating Imbalance in Synchronous Distributed Training of Deep Neural Networks.
Haiyang Lin, Mingyu Yan, Duo Wang, Wenming Li, Xiaochun Ye, Zhimin Tang, Dongrui Fan
Published in: ISPA/BDCloud/SocialCom/SustainCom (2021)
Keyphrases
- neural network
- training process
- training algorithm
- feed forward neural networks
- feedforward neural networks
- multi layer
- back propagation
- distributed environment
- error back propagation
- cooperative
- artificial neural networks
- backpropagation algorithm
- pattern recognition
- multi layer perceptron
- multilayer neural network
- deep architectures
- feed forward
- training set
- distributed data
- data sets
- neural network training
- multi agent
- training phase
- peer to peer
- computer networks
- training examples
- test set
- genetic algorithm
- neural network structure
- recurrent networks
- training patterns
- fault tolerant
- activation function
- self organizing maps
- multilayer perceptron
- online learning
- semi supervised
- recurrent neural networks
- training data
- hidden layer
- decision trees
- sufficient conditions
- communication cost