Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation
Taehyeon Kim
Jaehoon Oh
Nakyil Kim
Sangwook Cho
Se-Young Yun
Published in: IJCAI (2021)
Keyphrases
kullback leibler divergence
mutual information
information theoretic
information theory
prior knowledge
kl divergence
knowledge discovery
distance measure
machine learning
probability density function
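The two objectives named in the title are commonly instantiated as a temperature-softened KL divergence between teacher and student class distributions and a mean squared error applied directly to the logits. The sketch below is a minimal, hedged illustration of these two losses in PyTorch; it is not the paper's exact training recipe, and the function names, temperature value, and tensor shapes are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_kl_loss(student_logits, teacher_logits, temperature=4.0):
    """Temperature-softened KL-divergence distillation loss (Hinton-style).

    Both logit tensors are assumed to have shape (batch, num_classes).
    """
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # 'batchmean' matches the mathematical definition of KL averaged over the batch;
    # the T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

def kd_mse_loss(student_logits, teacher_logits):
    """Mean squared error computed directly between raw logit vectors."""
    return F.mse_loss(student_logits, teacher_logits)

# Usage with random tensors standing in for teacher and student outputs.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
print(kd_kl_loss(student_logits, teacher_logits).item())
print(kd_mse_loss(student_logits, teacher_logits).item())
```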