Information-Theoretic Bounds and Approximations in Neural Population Coding
Wentao Huang, Kechen Zhang. Published in: Neural Comput. (2018)
Keyphrases
- information-theoretic
- information theory
- mutual information
- theoretic framework
- information bottleneck
- neural network
- jensen-shannon divergence
- upper bound
- log-likelihood
- information-theoretic measures
- multi-modality
- kullback-leibler divergence
- minimum description length
- competitive learning
- entropy measure
- image registration
- computer vision
- machine learning