GKD: A General Knowledge Distillation Framework for Large-scale Pre-trained Language Model.

Shicheng Tan, Weng Lam Tam, Yuanchun Wang, Wenwen Gong, Yang Yang, Hongyin Tang, Keqing He, Jiahao Liu, Jingang Wang, Shu Zhao, Peng Zhang, Jie Tang
Published in: CoRR (2023)
Keyphrases
  • language model
  • general knowledge
  • probabilistic model
  • language modeling
  • pre-trained
  • n-gram
  • information retrieval
  • speech recognition
  • knowledge base
  • domain specific
  • document retrieval