Neural Attention Distillation: Erasing Backdoor Triggers from Deep Neural Networks.
Yige Li, Xixiang Lyu, Nodens Koren, Lingjuan Lyu, Bo Li, Xingjun Ma. Published in: ICLR (2021)