Weight Distillation: Transferring the Knowledge in Neural Network Parameters.

Ye Lin, Yanyang Li, Ziyang Wang, Bei Li, Quan Du, Tong Xiao, Jingbo Zhu
Published in: ACL/IJCNLP (1) (2021)
Keyphrases
  • network parameters
  • neural network
  • prior knowledge
  • knowledge base
  • pattern recognition
  • machine learning
  • decision trees
  • graphical models
  • data fusion
  • transfer learning
  • network architecture