Rademacher dropout: An adaptive dropout for deep neural network via optimizing generalization gap.
Haotian Wang
Wenjing Yang
Zhenyu Zhao
Tingjin Luo
Ji Wang
Yuhua Tang
Published in: Neurocomputing (2019)
Keyphrases
neural network
generalization bounds
data dependent
artificial neural networks
pattern recognition
adaptive fuzzy
prediction model
recurrent neural networks
machine learning
neural network model
fault diagnosis
multilayer perceptron
feed forward
neural nets
adaptive learning
back propagation
active learning
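The record above describes an adaptive dropout method, but the paper's specific mechanism (tuning the drop rate via a Rademacher-complexity bound on the generalization gap) is not reproduced here. For orientation, below is a minimal sketch of standard inverted dropout, the baseline such adaptive variants build on; the function name and interface are illustrative, not from the paper.

```python
import numpy as np

def dropout(x, p=0.5, rng=None, train=True):
    """Standard inverted dropout (illustrative baseline, not the paper's method).

    During training, each unit is zeroed independently with probability p,
    and the survivors are rescaled by 1 / (1 - p) so the expected activation
    is unchanged. At inference time the input passes through untouched.
    """
    if not train or p == 0.0:
        return x
    rng = np.random.default_rng(rng)
    mask = rng.random(x.shape) >= p          # keep a unit with prob. 1 - p
    return x * mask / (1.0 - p)              # inverted scaling
```

An adaptive scheme such as the one the title describes would replace the fixed `p` with a value chosen per layer or per iteration to tighten a data-dependent generalization bound.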