Asymptotic Normality for Plug-in Estimators of Generalized Shannon's Entropy.
Jialin Zhang, Jingyi Shi. Published in: CoRR (2022)
Keyphrases
- information theory
- Shannon entropy
- relative entropy
- information theoretic
- asymptotic properties
- rates of convergence
- mutual information
- conditional entropy
- large deviations
- Kullback-Leibler divergence
- information content
- data sets
- finite sample
- website
- minimum error
- unbiased estimator
- confidence intervals
- normal distribution
- objective function
- similarity measure
- case study
- knowledge base
- information systems
- artificial intelligence
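The listing gives no method details, but the title refers to the standard plug-in (empirical) estimator of Shannon entropy: substitute observed sample frequencies for the unknown true probabilities. A minimal sketch (function name and sample data are illustrative, not from the paper):

```python
import math
from collections import Counter

def plugin_entropy(sample):
    """Plug-in estimator of Shannon entropy (in nats):
    replace each true probability p_k with its empirical
    frequency c_k / n and evaluate -sum p log p."""
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Two symbols in equal proportion: estimate equals log 2 ≈ 0.6931 nats
print(plugin_entropy(["H", "T", "H", "T"]))
```

The plug-in estimator is biased downward in finite samples; the asymptotic-normality results the title describes are what justify building confidence intervals around such an estimate.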