Efficient Knowledge Distillation for RNN-Transducer Models.
Sankaran Panchapagesan, Daniel S. Park, Chung-Cheng Chiu, Yuan Shangguan, Qiao Liang, Alexander Gruenstein
Published in: CoRR (2020)
Keyphrases
- prior knowledge
- domain knowledge
- semantic models
- nearest neighbor
- statistical models
- machine learning
- formal models
- computationally expensive
- complex systems
- knowledge management
- knowledge representation
- knowledge base
- computationally efficient
- back propagation
- domain experts
- knowledge acquisition
- statistical model
- experimental data
- knowledge transfer
- knowledge extraction
- neural network