Training of spiking neural networks based on information theoretic costs.
Oleg Y. Sinyavskiy
Published in: CoRR (2016)
Keyphrases
- information theoretic
- spiking neural networks
- information theory
- mutual information
- theoretic framework
- training algorithm
- information bottleneck
- biologically inspired
- information theoretic measures
- Kullback-Leibler divergence
- relative entropy
- spiking neurons
- entropy measure
- feed-forward
- training process
- backpropagation
- Jensen-Shannon divergence
- KL divergence
- image classification
- object recognition
- decision trees