Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy
Robert K. Niven
Published in: CoRR (2005)
Keyphrases
- information theory
- cross-entropy
- information-theoretic
- log-likelihood
- Jensen-Shannon divergence
- Shannon entropy
- conditional entropy
- mutual information
- Kullback-Leibler divergence
- maximum likelihood
- relative entropy
- language modeling
- evaluation metrics
- error function
- information geometry
- computer vision
- MDL principle
- ranking functions
- image analysis
- loss function
- language model
- image processing
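Several of the keyphrases above (Shannon entropy, cross-entropy, Kullback-Leibler/relative entropy) name closely related quantities. As a minimal sketch using the standard textbook definitions, not the paper's combinatorial derivation, they can be computed for discrete distributions as follows:

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i * ln(p_i), with the convention 0 * ln(0) = 0
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * ln(q_i)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # Relative entropy D(p || q) = H(p, q) - H(p) >= 0,
    # with equality iff p == q (Gibbs' inequality)
    return cross_entropy(p, q) - shannon_entropy(p)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(shannon_entropy(p))   # ln 2 for a fair coin
print(kl_divergence(p, q))  # positive, since p != q
```

Natural logarithms (nats) are used here; base-2 logarithms (bits) differ only by a constant factor.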