Rethinking the Usage of Batch Normalization and Dropout in the Training of Deep Neural Networks
Guangyong Chen, Pengfei Chen, Yujun Shi, Chang-Yu Hsieh, Benben Liao, Shengyu Zhang. Published in: CoRR (2019)
Keyphrases
- neural network
- training process
- training algorithm
- multi-layer perceptron
- feedforward neural networks
- neural network training
- batch mode
- backpropagation algorithm
- feed-forward neural networks
- back propagation
- pattern recognition
- neural nets
- deep architectures
- test set
- genetic algorithm
- training phase
- neural network model
- multi-layer
- network architecture
- training patterns
- artificial neural networks
- supervised learning
- batch learning
- learning algorithm
- training data
- usage patterns
- fuzzy logic
- sufficient conditions
- self-organizing maps
- radial basis function
- multilayer perceptron