Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer
Zhiyuan Wu
Yu Jiang
Minghao Zhao
Chupeng Cui
Zongmin Yang
Xinhui Xue
Hong Qi
Published in: KSEM (2021)
Keyphrases
knowledge transfer
similarity measure
prior knowledge
probabilistic model
cross domain
multi domain
information technology
nearest neighbor
unsupervised learning
learning experience
transfer learning
network structure