Knowledge Distillation from BERT in Pre-Training and Fine-Tuning for Polyphone Disambiguation
Hao Sun, Xu Tan, Jun-Wei Gan, Sheng Zhao, Dongxu Han, Hongzhi Liu, Tao Qin, Tie-Yan Liu
Published in: ASRU (2019)
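The title refers to knowledge distillation, where a small student model is trained to match a larger teacher (here, BERT). As an illustration only — not the paper's exact objective — the standard distillation loss (Hinton et al., 2015) combines a temperature-softened KL term against the teacher with ordinary cross-entropy on the hard labels; the temperature `T` and weight `alpha` below are hypothetical hyperparameter choices:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Weighted sum of (a) KL divergence between temperature-softened
    teacher and student distributions and (b) cross-entropy with hard labels."""
    p_t = softmax(teacher_logits, T)   # soft teacher targets
    p_s = softmax(student_logits, T)   # soft student predictions
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    hard = softmax(student_logits)     # ordinary (T=1) student predictions
    ce = -np.log(hard[np.arange(len(labels)), labels] + 1e-12)
    # The T**2 factor rescales the soft-target gradients so the two
    # terms stay comparable in magnitude as T grows.
    return np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce)
```

When the student's logits equal the teacher's, the KL term vanishes and only the hard-label cross-entropy remains.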
Keyphrases
- fine tuning
- viable alternative
- domain knowledge
- fine tuned
- data mining
- knowledge acquisition
- knowledge extraction
- knowledge representation
- prior knowledge
- training set
- fine tune
- co-occurrence
- knowledge base
- knowledge based systems
- learning systems
- neural network
- subject matter experts
- training process
- background knowledge
- training examples
- training samples
- knowledge management
- knowledge discovery
- artificial neural networks
- expert systems
- case study