FastAdaBelief: Improving Convergence Rate for Belief-Based Adaptive Optimizers by Exploiting Strong Convexity.
Yangfan Zhou
Kaizhu Huang
Cheng Cheng
Xuguang Wang
Amir Hussain
Xin Liu
Published in: IEEE Trans. Neural Networks Learn. Syst. (2023)
Keyphrases
convergence rate
variable step size
step size
convergence speed
learning rate
global convergence
mutation operator
primal dual
gradient method
belief revision
wavelet neural network
numerical stability
neural network
denoising
belief functions