An information-theoretic proof of the Shannon-Hagelbarger theorem.
Venkat Anantharam
Published in: CoRR (2023)
Keyphrases
- information-theoretic
- information theory
- interactive theorem proving
- mutual information
- Jensen-Shannon divergence
- theoretical framework
- Shannon entropy
- information-theoretic measures
- entropy measure
- log-likelihood
- relative entropy
- Kullback-Leibler divergence
- multi-modality
- minimum description length
- MDL principle
- von Neumann
- computer vision
- KL divergence
- information bottleneck