Information-theoretic lower bounds for distributed function computation.
Aolin Xu, Maxim Raginsky. Published in: CoRR (2015)
Keyphrases
- information-theoretic
- lower bound
- upper bound
- mutual information
- information theory
- theoretic framework
- information bottleneck
- information-theoretic measures
- relative entropy
- Kullback-Leibler (KL) divergence
- Jensen-Shannon divergence
- Bregman divergences
- entropy measure
- minimum description length
- log-likelihood
- multi-modality
- image processing
- sample size
- pattern recognition