Information-theoretic lower bounds for distributed statistical estimation with communication constraints.
John C. Duchi, Michael I. Jordan, Martin J. Wainwright, Yuchen Zhang. Published in: CoRR (2014)
Keyphrases
- information-theoretic
- statistical estimation
- lower bound
- information theory
- mutual information
- lower and upper bounds
- information bottleneck
- upper bound
- communication cost
- theoretical framework
- objective function
- min-cut
- relative entropy
- Jensen-Shannon divergence
- information-theoretic measures
- entropy measure
- Kullback-Leibler divergence
- image segmentation
- log-likelihood
- multi-modality
- computational learning theory
- minimum description length
- KL divergence
- VC dimension
- learning theory
- medical images
- k-NN
- distributional clustering