Information-Theoretic Bounds and Approximations in Neural Population Coding.
Wentao Huang, Kechen Zhang
Published in: CoRR (2016)
Keyphrases
- information-theoretic
- mutual information
- information theory
- theoretic framework
- upper bound
- information bottleneck
- information-theoretic measures
- neural network
- Kullback-Leibler divergence
- KL divergence
- Jensen-Shannon divergence
- competitive learning
- multi-modality
- entropy measure
- closed form
- minimum description length
- log likelihood
- Kullback-Leibler
- worst case