Sharp Moment-Entropy Inequalities and Capacity Bounds for Symmetric Log-Concave Distributions
Mokshay Madiman, Piotr Nayar, Tomasz Tkocz
Published in: IEEE Trans. Inf. Theory (2021)
Keyphrases
- cumulative residual entropy
- Kullback-Leibler divergence
- upper bound
- large deviations
- information theoretic
- sufficient conditions
- information theory
- probability distribution
- mutual information
- error bounds
- lower bound
- information entropy
- upper and lower bounds
- VC dimension
- average case
- linear inequalities
- Zernike moments
- joint distribution
- Shannon entropy
- Bayesian networks
- queue length
- random variables
- worst case
- branch and bound search
- integer solution