NEO-KD: Knowledge-Distillation-Based Adversarial Training for Robust Multi-Exit Neural Networks
Seokil Ham, Jungwuk Park, Dong-Jun Han, Jaekyun Moon. Published in: NeurIPS (2023)
Keyphrases
- neural network
- training process
- training algorithm
- knowledge discovery
- knowledge management
- knowledge acquisition
- domain knowledge
- feed forward neural networks
- data mining techniques
- feed forward
- knowledge sources
- test set
- back propagation
- knowledge based systems
- feedforward neural networks
- training set
- training phase
- backpropagation algorithm
- recurrent neural networks
- domain experts
- training examples
- higher level
- supervised learning
- fuzzy logic
- active learning