Breaking Batch Normalization for better explainability of Deep Neural Networks through Layer-wise Relevance Propagation.
Mathilde Guillemot, Catherine Heusele, Rodolphe Korichi, Sylvianne Schnebert, Liming Chen
Published in: CoRR (2020)
Keyphrases
- neural network
- multi layer
- multiple layers
- single layer
- pattern recognition
- artificial neural networks
- fuzzy logic
- relevance feedback
- fault diagnosis
- preprocessing
- feed forward
- feed forward neural networks
- fuzzy systems
- neural nets
- data sets
- back propagation
- pairwise
- network architecture
- application layer
- wave propagation
- test collection
- recurrent neural networks
- web search
- expert systems
- feedforward neural networks
- cellular neural networks
- batch mode
- decision trees
- machine learning
- normalization method
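The "breaking" of batch normalization named in the title corresponds to the standard practice of folding a batch-normalization layer into the adjacent affine layer, so that LRP rules, which are defined for plain linear layers, can be applied to the fused weights. Below is a minimal PyTorch sketch of that well-known folding identity; the function name `fold_batch_norm` and the toy dimensions are illustrative assumptions, not details taken from the paper itself.

```python
import torch
import torch.nn as nn

def fold_batch_norm(linear: nn.Linear, bn: nn.BatchNorm1d) -> nn.Linear:
    """Fold a BatchNorm1d layer into the preceding Linear layer.

    Standard identity: for y = gamma * (Wx + b - mu) / sigma + beta
    with sigma = sqrt(running_var + eps), the fused layer has
    W' = (gamma / sigma) * W  and  b' = gamma * (b - mu) / sigma + beta.
    (Illustrative sketch; not the authors' exact procedure.)
    """
    sigma = torch.sqrt(bn.running_var + bn.eps)
    scale = bn.weight / sigma  # gamma / sigma, one factor per output unit

    fused = nn.Linear(linear.in_features, linear.out_features)
    with torch.no_grad():
        # scale each output row of W, then fold mean/shift into the bias
        fused.weight.copy_(linear.weight * scale.unsqueeze(1))
        fused.bias.copy_((linear.bias - bn.running_mean) * scale + bn.bias)
    return fused

# Usage: in eval mode, the fused layer reproduces Linear followed by BN.
linear = nn.Linear(4, 3)
bn = nn.BatchNorm1d(3).eval()
# give the BN buffers non-trivial statistics so the check is meaningful
bn.running_mean.uniform_(-1.0, 1.0)
bn.running_var.uniform_(0.5, 2.0)

fused = fold_batch_norm(linear, bn)
x = torch.randn(8, 4)
assert torch.allclose(bn(linear(x)), fused(x), atol=1e-5)
```

Note that the folding uses the frozen running statistics, so it is only valid for a network in inference (eval) mode, which is the setting in which LRP explanations are computed.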