Information theory meets circuit design: Why capacity-approaching codes require more chip area and power.
Pulkit Grover, Andrea Goldsmith, Anant Sahai, Jan M. Rabaey. Published in: Allerton (2011)
Keyphrases
- circuit design
- information theory
- information theoretic
- statistical learning
- conditional entropy
- Jensen-Shannon divergence
- power consumption
- design automation
- Shannon entropy
- statistical mechanics
- digital circuits
- Kullback-Leibler divergence
- relative entropy
- motion estimation
- MDL principle
- statistical physics
- image segmentation
- pattern recognition