New Bounds of a Measure in Information Theory.
Mihaela Alexandra Popescu
Oana Slusanschi
Alexandru-Corneliu Olteanu
Florin Pop
Published in: HPCC/CSS/ICESS (2014)
Keyphrases
information theory
information-theoretic
rate-distortion theory
Jensen-Shannon divergence
statistical mechanics
Shannon entropy
conditional entropy
statistical learning
Kullback-Leibler divergence
relative entropy
statistical physics
worst case
MDL principle
lower bound
image processing
KL divergence
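For context only (this sketch is not taken from the paper itself): several of the keyphrases above name standard information-theoretic quantities. The following minimal Python sketch uses their textbook definitions of Shannon entropy, Kullback-Leibler (relative) entropy, and the Jensen-Shannon divergence; the function names and example distributions are illustrative choices, not anything defined by the authors.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence (relative entropy) D(p || q), in bits."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def jensen_shannon_divergence(p, q):
    """Jensen-Shannon divergence: symmetrized, bounded variant of KL divergence."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

if __name__ == "__main__":
    # Illustrative distributions, chosen only for this example.
    p = [0.5, 0.3, 0.2]
    q = [0.4, 0.4, 0.2]
    print(shannon_entropy(p))               # about 1.485 bits
    print(kl_divergence(p, q))              # nonnegative (Gibbs' inequality)
    print(jensen_shannon_divergence(p, q))  # bounded above by 1 bit
```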