Generalized Bayesian Cramér-Rao Inequality via Information Geometry of Relative $α$-Entropy.
Kumar Vijay Mishra, M. Ashok Kumar. Published in: CoRR (2020)
Keyphrases
- relative entropy
- information geometry
- information theory
- Bregman divergences
- information-theoretic
- Kullback–Leibler divergence
- maximum entropy
- cost-sensitive
- learning theory
- mutual information
- Mahalanobis distance
- loss function
- theoretical guarantees
- data-dependent
- covariance matrix
- log-likelihood
- Gaussian mixture model
- KL divergence
- nearest neighbor
- nonnegative matrix factorization
- Bayesian networks
- posterior distribution
- Bayesian inference
- multi-class
- special case
- objective function