Information-theoretic lower bounds for distributed statistical estimation with communication constraints.
Yuchen Zhang, John C. Duchi, Michael I. Jordan, Martin J. Wainwright. Published in: NIPS (2013)
Keyphrases
- information theoretic
- statistical estimation
- lower bound
- mutual information
- information theory
- theoretic framework
- communication cost
- upper bound
- lower and upper bounds
- information bottleneck
- jensen shannon divergence
- log likelihood
- information theoretic measures
- kullback leibler divergence
- multi modality
- objective function
- entropy measure
- relative entropy
- similarity measure
- min cut
- bregman divergences
- computational learning theory
- optimal solution
- minimum description length
- kl divergence
- data mining
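Several keyphrases above (relative entropy, kullback leibler divergence, kl divergence) refer to the same quantity, which is central to the paper's information-theoretic lower-bound arguments. As a minimal illustrative sketch (the function name and example distributions are ours, not from the paper), the KL divergence between two discrete distributions can be computed as:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), in nats, between two
    discrete distributions given as sequences of probabilities.
    Terms with p_i = 0 contribute 0 by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: two Bernoulli distributions.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ≈ 0.5108 nats
```

Note that KL divergence is asymmetric: D(p || q) generally differs from D(q || p), which is why it is a divergence rather than a metric.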