Concentration of the multinomial in Kullback-Leibler divergence near the ratio of alphabet and sample sizes.
Rohit Agrawal. Published in: CoRR (2019)
Keyphrases
- sample size
- Kullback-Leibler divergence
- Fisher information
- probability density function
- mutual information
- information theoretic
- model selection
- information theory
- upper bound
- random sampling
- distance measure
- expectation maximization
- text classification
- marginal distributions
- covariance matrix
- text categorization
- worst case
- machine learning
- probabilistic model
- naive Bayes
- EM algorithm
- Gaussian mixture model
- learning algorithm
- probability distribution
- semi-supervised
- parameter estimation
- maximum likelihood
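The title concerns the empirical distribution p̂ of n samples drawn from a multinomial over a k-letter alphabet: the divergence KL(p̂ ‖ p) concentrates at a scale governed by the ratio k/n. A minimal simulation sketch of that phenomenon (this is our own illustration, not code from the paper; the uniform choice of p, the trial count, and all names are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_divergence(p_hat, p):
    # KL(p_hat || p); categories with p_hat == 0 contribute 0 by convention
    mask = p_hat > 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / p[mask])))

k, n = 100, 10_000        # alphabet size and sample size
p = np.full(k, 1.0 / k)   # uniform true distribution (illustrative choice)

# Draw repeated empirical distributions and measure KL(p_hat || p)
kls = []
for _ in range(200):
    counts = rng.multinomial(n, p)
    kls.append(kl_divergence(counts / n, p))

# The observed divergences cluster at a scale comparable to k/n
print(f"mean KL = {np.mean(kls):.5f},  k/n = {k / n:.5f}")
```

In this regime the mean divergence lands on the order of k/n (classically, E[KL(p̂ ‖ p)] ≈ (k−1)/(2n)), which is the scale the title refers to.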