An Empirical Study of Iterative Knowledge Distillation for Neural Network Compression.
Sharan Yalburgi, Tirtharaj Dash, Ramya Hebbalaguppe, Srinidhi Hegde, Ashwin Srinivasan
Published in: ESANN (2020)
Keyphrases
- neural network
- knowledge acquisition
- knowledge representation
- domain knowledge
- machine learning
- artificial neural networks
- prior knowledge
- data mining techniques
- knowledge management
- knowledge base
- knowledge discovery
- genetic algorithm
- knowledge sharing
- knowledge transfer
- pattern recognition
- expert systems
- video sequences
- data sets
- multiscale
- Bayesian networks
- training data
- image quality
- neural network model
- multi-layer
- iterative methods