On a 2-relative entropy.
James Fullwood
Published in:
CoRR (2021)
Keyphrases
relative entropy
information theoretic
information theory
log likelihood
mutual information
covariance matrix
Mahalanobis distance
Kullback-Leibler divergence
Bregman divergences
active learning
maximum entropy
Bayesian networks
high dimensional
squared euclidean distance