A characterization of statistical manifolds on which the relative entropy is a Bregman divergence.
Hiroshi Nagaoka
Published in: ISIT (2016)
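For context, the paper's titular relationship can be checked numerically on the simplest exponential family. For the Bernoulli family with natural parameter theta = log(p/(1-p)) and log-partition function psi(theta) = log(1 + e^theta), the relative entropy D(p || q) equals the Bregman divergence of psi with swapped arguments in natural coordinates. A minimal sketch (function names are illustrative, not from the paper):

```python
import math

def kl_bernoulli(p, q):
    """Relative entropy D(Ber(p) || Ber(q))."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def psi(theta):
    """Log-partition (cumulant) function of the Bernoulli family."""
    return math.log(1 + math.exp(theta))

def bregman_psi(theta_a, theta_b):
    """Bregman divergence D_psi(theta_a, theta_b) generated by psi."""
    grad_b = math.exp(theta_b) / (1 + math.exp(theta_b))  # psi'(theta_b) = mean
    return psi(theta_a) - psi(theta_b) - grad_b * (theta_a - theta_b)

def nat(p):
    """Natural parameter of Ber(p)."""
    return math.log(p / (1 - p))

p, q = 0.7, 0.2
# D(Ber(p) || Ber(q)) agrees with D_psi(nat(q), nat(p))
print(kl_bernoulli(p, q))
print(bregman_psi(nat(q), nat(p)))
```

The two printed values agree to floating-point precision, illustrating the exponential-family case that the paper's characterization generalizes.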
Keyphrases
- relative entropy
- Bregman divergences
- information theoretic
- information theory
- cost sensitive
- maximum entropy
- mutual information
- Mahalanobis distance
- exponential family
- log likelihood
- squared Euclidean distance
- statistical models
- theoretical guarantees
- loss function
- learning theory
- nearest neighbor
- special case
- low dimensional
- nonnegative matrix factorization
- graphical models
- KL divergence
- data dependent
- boosting algorithms
- high dimensional
- Bayesian networks
- feature selection