Diverse Knowledge Distillation (DKD): A Solution for Improving the Robustness of Ensemble Models Against Adversarial Attacks
Ali Mirzaeian
Jana Kosecka
Houman Homayoun
Tinoosh Mohsenin
Avesta Sasan
Published in:
ISQED (2021)
Keyphrases
prior knowledge
probabilistic model
knowledge representation
domain knowledge
knowledge based systems
artificial neural networks
numerical methods
real world
statistical models
multi agent
training data
access control
model selection
knowledge base
information systems
countermeasures
data mining