A new information-theoretic lower bound for distributed function computation.
Aolin Xu, Maxim Raginsky
Published in: ISIT (2014)
Keyphrases
- information-theoretic
- lower bound
- mutual information
- information theory
- upper bound
- theoretical framework
- information bottleneck
- log-likelihood
- information-theoretic measures
- objective function
- Jensen-Shannon divergence
- minimum description length
- KL divergence
- entropy measure
- multi-modality
- Kullback-Leibler divergence
- optimal solution
- computational learning theory
- Bregman divergences
- relative entropy
- maximum entropy
- support vector
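For reference, the two central quantities named in the keyphrases, relative entropy (KL divergence) and mutual information, have the standard textbook definitions below; these definitions are general and are not taken from the paper's specific bound.

```latex
% Standard definitions (not specific to this paper):
% relative entropy (Kullback-Leibler divergence) between distributions P and Q,
% and mutual information expressed as a relative entropy.
\[
  D(P \,\|\, Q) \;=\; \sum_{x} P(x) \log \frac{P(x)}{Q(x)}
\]
\[
  I(X;Y) \;=\; D\bigl(P_{XY} \,\|\, P_X \otimes P_Y\bigr)
\]
```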