Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer
Zhiyuan Wu
Yu Jiang
Minghao Zhao
Chupeng Cui
Zongmin Yang
Xinhui Xue
Hong Qi
Published in: CoRR (2021)
Keyphrases
probabilistic model
knowledge transfer
multi-domain
prior knowledge
cross domain
similarity measure
unsupervised learning
data mining
social networks
social network analysis
knowledge sharing