Knowledge Distillation with Source-free Unsupervised Domain Adaptation for BERT Model Compression
Jing Tian, Juan Chen, Ningjiang Chen, Lin Bai, Suqun Huang
Published in: CSCWD (2023)
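The record above carries no method details, so as generic background only, here is a minimal NumPy sketch of the standard temperature-scaled knowledge-distillation objective (soft teacher/student KL plus hard-label cross-entropy). This is the classic Hinton-style formulation, not the specific source-free loss used in this paper; the function names, the `alpha`/`T` hyperparameters, and the implementation are my own illustration.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Generic KD objective (assumption, not this paper's loss):
    alpha * T^2 * KL(teacher || student) + (1 - alpha) * CE(student, labels).
    The T^2 factor keeps the soft-target gradient scale comparable to CE."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    hard_probs = softmax(student_logits)  # T = 1 for the hard-label term
    ce = -np.log(hard_probs[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

A student whose logits match the teacher's and point at the correct class incurs only the (small) cross-entropy term, while a disagreeing student pays both terms, which is the pressure that transfers the teacher's "dark knowledge" during compression.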