Layer-Parallel Training of Deep Residual Neural Networks.
Stefanie Günther, Lars Ruthotto, Jacob B. Schroder, Eric C. Cyr, Nicolas R. Gauger. Published in: SIAM J. Math. Data Sci. (2020)
Keyphrases
- neural network
- artificial neural networks
- multiple layers
- training process
- training algorithm
- backpropagation algorithm
- feedforward neural networks
- pattern recognition
- training set
- neural network training
- training examples
- recurrent neural networks
- multilayer perceptron
- training phase
- deep architectures
- parallel processing
- fuzzy logic
- active learning
- parallel implementation
- activation function
- sufficient conditions
- online learning
- genetic algorithm
- supervised learning
- output layer
- parallel programming
- training samples
- massively parallel
- test set
- parallel algorithm
- hidden layer