Training neural networks to have brain-like representations improves object recognition performance.
Callie Federer, Haoyan Xu, Alona Fyshe, Joel Zylberberg
Published in: CoRR (2019)
Keyphrases
- object recognition
- neural network
- training process
- training algorithm
- feedforward neural networks
- backpropagation algorithm
- genetic algorithm
- training phase
- pattern recognition
- fuzzy logic
- computer vision
- training set
- human brain
- test set
- higher level
- neural network model
- artificial neural networks
- object description
- distributed representations
- training data
- image features
- neural network training
- error backpropagation
- recurrent neural networks
- natural images
- visual recognition
- symbolic representation
- brain images
- object representation
- radial basis function network
- object class
- multilayer perceptron
- supervised learning
- object detection
- symbolic knowledge
- dissociated dipoles