Bregman Divergence Bounds and the Universality of the Logarithmic Loss.
Amichai Painsky, Gregory W. Wornell
Published in: CoRR (2018)
Keyphrases
- loss bounds
- Bregman divergences
- worst case
- regret bounds
- cost-sensitive
- maximum entropy
- expert advice
- theoretical guarantees
- temporal difference learning
- information-theoretic
- learning theory
- Mahalanobis distance
- linear regression
- exponential family
- nearest neighbor
- upper bound
- loss function
- lower bound
- KL divergence
- NP-hard
- maximum likelihood
- data-dependent
- special case
- missing data
- k-NN
- missing values
- decision trees
- sample size
- principal component analysis
- Bayesian networks