FFT-based Gradient Sparsification for the Distributed Training of Deep Neural Networks.
Linnan Wang, Wei Wu, Junyu Zhang, Hang Liu, George Bosilca, Maurice Herlihy, Rodrigo Fonseca. Published in: HPDC (2020)
Keyphrases
- communication cost
- neural network
- training process
- training algorithm
- multi layer perceptron
- feedforward neural networks
- feed forward neural networks
- pattern recognition
- training set
- edge detection
- backpropagation algorithm
- deep architectures
- neural network training
- back propagation
- artificial neural networks
- fuzzy logic
- training examples
- training samples
- least squares
- radial basis function network
- recurrent networks
- recurrent neural networks
- data sets
- frequency domain
- online learning
- multi layer
- neural nets
- multilayer perceptron
- signal processing
- cooperative
- neural network structure
- genetic algorithm
- error back propagation
- multilayer neural network
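As a rough illustration of the idea named in the title (not the authors' HPDC'20 algorithm; the function name, `keep_ratio` parameter, and top-k selection are assumptions), a gradient can be sparsified in the frequency domain to cut communication cost: transform it with an FFT, keep only the largest coefficients, and reconstruct an approximate gradient on the receiving side. A minimal NumPy sketch:

```python
import numpy as np

def fft_sparsify(grad, keep_ratio=0.1):
    """Illustrative sketch only: compress a gradient by keeping the
    largest-magnitude FFT coefficients (not the paper's exact method)."""
    spectrum = np.fft.rfft(grad)                      # real FFT of the flat gradient
    k = max(1, int(keep_ratio * spectrum.size))       # number of coefficients to keep
    idx = np.argpartition(np.abs(spectrum), -k)[-k:]  # indices of the top-k magnitudes
    sparse = np.zeros_like(spectrum)
    sparse[idx] = spectrum[idx]                       # only these (index, value) pairs would be sent
    return np.fft.irfft(sparse, n=grad.size)          # receiver reconstructs an approximate gradient

# Toy usage: measure relative reconstruction error on a random gradient
g = np.random.randn(1024)
g_approx = fft_sparsify(g, keep_ratio=0.05)
print(np.linalg.norm(g - g_approx) / np.linalg.norm(g))
```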