From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels.
Zhendong Yang
Ailing Zeng
Zhe Li
Tianke Zhang
Chun Yuan
Yu Li
Published in: CoRR (2023)
Keyphrases
domain knowledge
knowledge representation
knowledge base
knowledge acquisition
knowledge extraction
neural network
data mining
high level
pairwise
evolutionary algorithm
data sets
training data
expert systems
prior knowledge
semi supervised
knowledge based systems