Distiller: A Systematic Study of Model Distillation Methods in Natural Language Processing.
Haoyu He, Xingjian Shi, Jonas Mueller, Sheng Zha, Mu Li, George Karypis. Published in: SustaiNLP@EMNLP (2021)
Keyphrases
- qualitative and quantitative
- theoretical framework
- computational model
- empirical studies
- statistical models
- study proposes
- hybrid method
- probabilistic model
- natural language processing
- prior knowledge
- theoretical models
- mathematical models
- conceptual model
- machine learning methods
- neural network
- benchmark datasets
- data mining
- conditional random fields
- text mining
- statistical methods
- simulation study
- theoretical foundation
- computational cost
- recommender systems
- monte carlo simulation
- preprocessing
- social networks
- linear models