Optimizing Learning Rate Schedules for Iterative Pruning of Deep Neural Networks
Shiyu Liu
Rohan Ghosh
Chong Min John Tan
Mehul Motani
Published in: CoRR (2022)
Keyphrases
learning rate
neural network
activation function
hidden layer
backpropagation algorithm
training algorithm
adaptive learning rate
convergence rate
error function
learning algorithm
convergence speed
feed forward neural networks
rapid convergence
fuzzy logic
fuzzy neural network
scheduling problem
training speed
search space
neural nets
recurrent neural networks
artificial neural networks
feed forward
weight vector
neural network model
feedforward neural networks
delta-bar-delta
particle swarm optimization (PSO)
fault diagnosis
backpropagation
high accuracy
multi class
machine learning