Parallel Blockwise Knowledge Distillation for Deep Neural Network Compression.
Cody Blakeney, Xiaomin Li, Yan Yan, Ziliang Zong. Published in: CoRR (2020)
Keyphrases
- neural network
- domain knowledge
- knowledge representation
- data compression
- knowledge base
- artificial neural networks
- knowledge management
- compression algorithm
- higher level
- knowledge acquisition
- data mining techniques
- genetic algorithm
- scheduling problem
- knowledge discovery
- knowledge based systems
- prior knowledge
- background knowledge
- neural network model
- parallel processing
- data mining
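Since the entry concerns knowledge distillation for neural network compression, a minimal sketch of the standard distillation loss (Hinton-style soft-target matching) may help orient readers. This is a generic illustration only, not the parallel blockwise procedure of the paper; the function name, temperature `T`, and weighting `alpha` are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic knowledge-distillation loss (not the paper's parallel blockwise
    variant): a weighted sum of cross-entropy on hard labels and KL divergence
    between temperature-softened teacher and student logits."""
    # Soft targets from the (frozen) teacher, softened by temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    soft_student = F.log_softmax(student_logits / T, dim=1)
    # Scale the KL term by T^2 so its gradients stay comparable to the hard term.
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```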