Leave No Knowledge Behind During Knowledge Distillation: Towards Practical and Effective Knowledge Distillation for Code-Switching ASR Using Realistic Data.
Liang-Hsuan Tseng, Zih-Ching Chen, Wei-Shun Chang, Cheng-Kuang Lee, Tsung-Ren Huang, Hung-yi Lee
Published in: CoRR (2024)
Keyphrases
- raw data
- expert knowledge
- prior knowledge
- knowledge base
- background knowledge
- data mining techniques
- knowledge discovery
- high quality
- synthetic data
- knowledge management
- expert systems
- acquisition process
- knowledge acquisition
- domain knowledge
- image data
- data analysis
- knowledge representation
- domain experts
- knowledge sharing
- semantically rich
- enormous amounts
- feature space
- data collection
- training data
- knowledge sources
- data sets
- knowledge extraction
- relational databases