Learning to Augment for Data-Scarce Domain BERT Knowledge Distillation
Lingyun Feng, Minghui Qiu, Yaliang Li, Hai-Tao Zheng, Ying Shen. Published in: CoRR (2021)
Keyphrases
- background knowledge
- prior knowledge
- domain experts
- high quality
- knowledge discovery
- data sets
- learning systems
- learning algorithm
- data points
- data collection
- raw data
- knowledge transfer
- data sources
- learning process
- prior domain knowledge
- human experts
- complex domains
- database
- enormous amounts
- data mining techniques
- data analysis
- data processing
- domain specific
- image data
- data quality
- meta knowledge
- training data
- online learning
- domain independent
- expert knowledge
- concept maps
- learning models
- knowledge representation
- domain expertise
- machine learning