AdaBatch: Adaptive Batch Sizes for Training Deep Neural Networks
Aditya Devarakonda, Maxim Naumov, Michael Garland. Published in: CoRR (2017)
Keyphrases
- neural network
- training process
- training algorithm
- feed-forward neural networks
- backpropagation
- pattern recognition
- neural network training
- multi-layer perceptron
- deep architectures
- test set
- batch mode
- radial basis function network
- training set
- feedforward neural networks
- multi-layer
- fuzzy logic
- backpropagation algorithm
- error backpropagation
- artificial neural networks
- recurrent networks
- associative memory
- training speed
- training patterns
- quality prediction
- neural network structure
- learning capabilities
- training samples
- evolutionary algorithm
- hidden layer
- fuzzy systems
- multilayer perceptron
- feed-forward
- self-organizing maps
- supervised learning
- training examples