Is normalization indispensable for training deep neural networks?
Jie Shao, Kai Hu, Changhu Wang, Xiangyang Xue, Bhiksha Raj

Published in: NeurIPS (2020)
Keyphrases
- neural network
- training algorithm
- training process
- feedforward neural networks
- multi layer perceptron
- artificial neural networks
- feed forward neural networks
- back propagation
- training patterns
- neural network training
- genetic algorithm
- neural network structure
- train a neural network
- neural network model
- preprocessing
- adaptive learning rate
- hidden layer
- pattern recognition
- backpropagation algorithm
- training set
- control system
- decision trees
- multilayer neural network
- deep architectures
- bp network
- back propagation neural network
- radial basis function
- online learning
- activation function
- unsupervised learning
- fault diagnosis
- self organizing maps
- learning rate
- prediction model
- multilayer perceptron