Estimating the Entropy of Linguistic Distributions.
Aryaman Arora, Clara Meister, Ryan Cotterell. Published in: ACL (2) (2022)
Keyphrases
- Kullback-Leibler divergence
- cumulative residual entropy
- mutual information
- information theoretic
- information theory
- natural language
- probability distribution
- Shannon entropy
- neural network
- power law
- information entropy
- joint distribution
- exponential distributions
- entropy measure
- estimation process
- accurate estimation
- information content
- natural language processing
- Bayesian networks
- case study
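The keyphrases above center on estimating Shannon entropy from samples. As a point of reference (not the paper's own estimators), a minimal sketch of the standard maximum-likelihood "plug-in" estimator, together with the well-known Miller-Madow bias correction, might look like this; the function names are illustrative:

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Maximum-likelihood (plug-in) estimate of Shannon entropy, in nats.

    Uses empirical frequencies p(x) = count(x) / n and computes
    H = -sum_x p(x) * log p(x). Known to underestimate the true entropy
    for small samples.
    """
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the Miller-Madow bias correction (K-1)/(2n),

    where K is the number of distinct symbols observed in the sample.
    """
    k = len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * len(samples))
```

For a balanced two-symbol sample such as `list("abab")`, the plug-in estimate recovers ln 2 exactly; on sparser linguistic data the two estimators diverge, which is the kind of small-sample bias that entropy-estimation work addresses.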