HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression
Chenhe Dong, Yaliang Li, Ying Shen, Minghui Qiu. Published in: EMNLP (1) (2021)
Keyphrases
- language model
- cross-domain
- knowledge transfer
- language modeling
- n-gram
- multiple domains
- probabilistic model
- retrieval model
- information retrieval
- smoothing methods
- transfer learning
- knowledge sharing
- query expansion
- ad hoc information retrieval
- prior knowledge
- text categorization
- error rate
- mixture model
- knowledge management
- co-occurrence
- translation model
- data mining