Knowledge Distillation Methods for Efficient Unsupervised Adaptation Across Multiple Domains
Le Thanh Nguyen-Meidine
Atif Belal
Madhu Kiran
Jose Dolz
Louis-Antoine Blais-Morin
Eric Granger
Published in: CoRR (2021)
Keyphrases
multiple domains
pairwise
knowledge sources
knowledge base
decision trees
case study
high level
prior knowledge
knowledge management
semi-supervised learning