Neural Attention Distillation: Erasing Backdoor Triggers from Deep Neural Networks.
Yige Li, Xixiang Lyu, Nodens Koren, Lingjuan Lyu, Bo Li, Xingjun Ma. Published in: CoRR (2021)
Keyphrases
- neural network
- network architecture
- artificial neural networks
- backpropagation
- learning rules
- pattern recognition
- neural model
- feed-forward
- associative memory
- neural learning
- fuzzy logic
- neural fuzzy
- genetic algorithm
- connectionist models
- neural architectures
- auto-associative
- focus of attention
- active databases
- multi-layer perceptron
- fuzzy systems
- fault diagnosis
- activation function
- feed-forward neural networks
- multi-layer
- recurrent networks
- neural nets
- Hebbian learning
- self-organizing maps
- distributed representations
- neural computation
- learning algorithm