FastHebb: Scaling Hebbian Training of Deep Neural Networks to ImageNet Level.
Gabriele Lagani, Claudio Gennaro, Hannes Fassold, Giuseppe Amato
Published in: CoRR (2022)
Keyphrases
- neural network
- training process
- training algorithm
- feed forward neural networks
- deep architectures
- feedforward neural networks
- back propagation
- learning rules
- associative memory
- neural network training
- hebbian learning
- artificial neural networks
- backpropagation algorithm
- recurrent networks
- multi layer perceptron
- biologically inspired
- training phase
- recurrent neural networks
- activation function
- image collections
- feed forward
- linear svm
- supervised learning
- multi layer
- test set
- higher level
- fuzzy logic
- 3D objects
- hidden markov models
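
As a rough illustration of the "hebbian learning" keyphrase above, the sketch below shows a plain Hebbian weight update (Δw = η · y · xᵀ) for a single linear layer. This is the generic textbook rule, not necessarily the FastHebb formulation from the paper; the function and parameter names are illustrative assumptions.

```python
import numpy as np

def hebbian_update(W, x, lr=0.01):
    """One generic Hebbian step for a linear layer (illustrative sketch only).

    W  : (out_dim, in_dim) weight matrix
    x  : (in_dim,) pre-synaptic input activity
    lr : learning rate
    """
    y = W @ x                  # post-synaptic activity
    dW = lr * np.outer(y, x)   # Hebb's rule: correlated activity strengthens weights
    return W + dW
```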