Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism.
Chenqi Guo
Shiwei Zhong
Xiaofeng Liu
Qianli Feng
Yinglong Ma
Published in: CoRR (2024)