Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation.

Taehyeon Kim, Jaehoon Oh, Nakyil Kim, Sangwook Cho, Se-Young Yun
Published in: IJCAI (2021)
Keyphrases
  • kullback leibler divergence
  • mutual information
  • information theoretic
  • information theory
  • prior knowledge
  • kl divergence
  • knowledge discovery
  • distance measure
  • machine learning
  • probability density function
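
For readers unfamiliar with the two objectives compared in the title, the sketch below contrasts a Hinton-style softened-softmax KL divergence loss with a direct mean squared error (logit-matching) loss for knowledge distillation. This is a minimal, generic illustration rather than the authors' implementation; the function names `kd_kl_loss`, `kd_mse_loss`, and the temperature value `T` are illustrative assumptions.

```python
# A minimal sketch (not the paper's code) of the two distillation objectives
# the title compares: softened-softmax KL divergence vs. MSE on raw logits.
import torch
import torch.nn.functional as F


def kd_kl_loss(student_logits: torch.Tensor,
               teacher_logits: torch.Tensor,
               T: float = 4.0) -> torch.Tensor:
    """KL-based KD loss between temperature-softened teacher and student distributions."""
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    # 'batchmean' averages the KL over the batch; the T**2 factor keeps
    # gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T ** 2)


def kd_mse_loss(student_logits: torch.Tensor,
                teacher_logits: torch.Tensor) -> torch.Tensor:
    """Logit-matching KD loss: mean squared error between raw logits."""
    return F.mse_loss(student_logits, teacher_logits)


if __name__ == "__main__":
    s = torch.randn(8, 10)   # student logits: batch of 8, 10 classes
    t = torch.randn(8, 10)   # teacher logits
    print("KL  distillation loss:", kd_kl_loss(s, t).item())
    print("MSE distillation loss:", kd_mse_loss(s, t).item())
```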