Adaptivity provably helps: information-theoretic limits on $l_0$ cost of non-adaptive sensing.
Sanghamitra Dutta, Pulkit Grover. Published in: CoRR (2016)
Keyphrases
- information theory
- mutual information
- information-theoretic framework
- information bottleneck
- Jensen-Shannon divergence
- information-theoretic measures
- log-likelihood
- multimodality
- entropy measure
- Kullback-Leibler divergence
- minimum description length
- relative entropy
- theoretical guarantees
- computational learning theory
- image analysis
- distributional clustering