Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation
Taehyeon Kim
Jaehoon Oh
Nakyil Kim
Sangwook Cho
Se-Young Yun
Published in: CoRR (2021)
Keyphrases
Kullback-Leibler divergence
knowledge discovery
distance measure
mutual information
prior knowledge
information theoretic