Gradient-trained Weights in Wide Neural Networks Align Layerwise to Error-scaled Input Correlations
Akhilan Boopathy, Ila Fiete
Published in: CoRR (2021)
Keyphrases
- neural network
- hidden units
- number of hidden units
- multilayer perceptron
- hidden layer
- rbf network
- synaptic weights
- pattern recognition
- training process
- feed forward neural networks
- radial basis function
- multi layer perceptron
- activation function
- back propagation
- artificial neural networks
- output layer
- trained neural networks
- error rate
- training error
- trained neural network
- neural network model
- backpropagation algorithm
- wide range
- fuzzy logic
- error analysis
- fuzzy neural network
- fuzzy systems
- edge detection
- learning rate
- genetic algorithm
- desired output
- feature space
- training set
- neural learning
- linear combination
- error bounds
- nonlinear functions
- feed forward
- fault diagnosis
- error function
- auto associative
- input data
- rbf neural network
- multi layer
- weight update
- learning algorithm
- weighted sum