Prompt-Distiller: Few-Shot Knowledge Distillation for Prompt-Based Language Learners with Dual Contrastive Learning.
Boyu Hou
Chengyu Wang
Xiaoqing Chen
Minghui Qiu
Liang Feng
Jun Huang
Published in: ICASSP (2023)
Keyphrases
language learners
learning algorithm
learning process
reinforcement learning
knowledge transfer
e-learning
pattern recognition
active learning
knowledge representation
language learning
case study
online learning
computer systems
mobile learning
video content