Training Neural Networks Using Predictor-Corrector Gradient Descent.
Amy Nesky, Quentin F. Stout. Published in: ICANN (3) (2018)
Keyphrases
- neural network
- training process
- training algorithm
- feedforward neural networks
- back propagation
- backpropagation algorithm
- cost function
- artificial neural networks
- error back propagation
- pattern recognition
- feed forward neural networks
- learning rules
- multi layer perceptron
- neural network training
- recurrent networks
- conjugate gradient
- fuzzy logic
- multilayer perceptron
- neural network model
- objective function
- neural network structure
- neural nets
- recurrent neural networks
- training phase
- feed forward
- online learning
- radial basis function network
- training set
- training speed
- multilayer neural network
- training patterns
- stochastic gradient descent
- cellular neural networks
- network architecture
- feature vectors
- machine learning
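The title names a predictor-corrector variant of gradient descent. The paper's specific update rule is not given here, so the following is only a minimal sketch of the general predictor-corrector idea applied to gradient descent (a Heun-style two-stage step on a toy quadratic objective), not necessarily the authors' method; `f`, `grad`, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

def f(w):
    # Toy quadratic objective, assumed for illustration only.
    return 0.5 * np.sum(w ** 2)

def grad(w):
    # Gradient of the toy objective above.
    return w

def predictor_corrector_gd(w0, lr=0.1, steps=100):
    """Generic predictor-corrector gradient descent (Heun-style sketch)."""
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        g = grad(w)
        w_pred = w - lr * g                 # predictor: plain gradient step
        g_corr = grad(w_pred)               # corrector: gradient at predicted point
        w = w - lr * 0.5 * (g + g_corr)     # averaged (corrected) update
    return w

w = predictor_corrector_gd(np.array([1.0, -2.0]))
```

On this convex toy problem the corrected iterate converges to the minimizer at the origin; the two-stage structure (predict, then re-evaluate the gradient to correct) is the common thread among predictor-corrector optimizers.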