Block layer decomposition schemes for training deep neural networks.
Laura Palagi, Ruggiero Seccia. Published in: J. Glob. Optim. (2020)
Keyphrases
- neural network
- multi layer
- training process
- feed forward neural networks
- training algorithm
- multiple layers
- feedforward neural networks
- multi layer perceptron
- error back propagation
- deep architectures
- training set
- neural network training
- artificial neural networks
- genetic algorithm
- single layer
- training phase
- back propagation
- output layer
- pattern recognition
- neural network model
- feed forward
- backpropagation algorithm
- neural nets
- test set
- recurrent neural networks
- radial basis function network
- decomposition method
- fuzzy logic
- application layer
- associative memory
- hidden nodes
- activation function
- multiresolution
- online learning
- multiscale
- fuzzy systems
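The keyphrases above center on decomposition methods for training multi-layer feedforward networks. As a rough illustration of the general idea (not the authors' specific scheme), the following hypothetical sketch alternates gradient steps on one layer's weights while the other layer is frozen, i.e. block coordinate descent over layers of a small two-layer network; all names, sizes, and the toy data are assumptions for illustration.

```python
import numpy as np

# Hypothetical sketch of a layer-wise block decomposition step:
# alternate gradient updates on W2 (output layer) and W1 (hidden
# layer), each with the other block held fixed.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 4))                          # toy inputs
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)  # toy targets

W1 = rng.normal(scale=0.5, size=(4, 8))   # hidden-layer weights
W2 = rng.normal(scale=0.5, size=(8, 1))   # output-layer weights
lr = 0.1

def forward(X, W1, W2):
    H = np.tanh(X @ W1)                    # hidden activations
    out = 1.0 / (1.0 + np.exp(-(H @ W2))) # sigmoid output
    return H, out

def mse(out, y):
    return float(np.mean((out - y) ** 2))

init_loss = mse(forward(X, W1, W2)[1], y)

for epoch in range(200):
    # Block 1: update output layer W2 with W1 frozen.
    H, out = forward(X, W1, W2)
    d_out = 2 * (out - y) / len(y) * out * (1 - out)
    W2 -= lr * H.T @ d_out
    # Block 2: update hidden layer W1 with W2 frozen.
    H, out = forward(X, W1, W2)
    d_out = 2 * (out - y) / len(y) * out * (1 - out)
    d_H = (d_out @ W2.T) * (1 - H ** 2)    # backprop through tanh
    W1 -= lr * X.T @ d_H

final_loss = mse(forward(X, W1, W2)[1], y)
print(init_loss, final_loss)
```

In practice, layer-wise decomposition schemes differ in how each block subproblem is solved (a single gradient step here, but possibly an exact or inexact inner minimization) and in the order blocks are visited.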