FastAdaBelief: Improving Convergence Rate for Belief-based Adaptive Optimizers by Exploiting Strong Convexity
Yangfan Zhou, Kaizhu Huang, Cheng Cheng, Xuguang Wang, Xin Liu
Published in: CoRR (2021)
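For context, the "belief" mechanism that FastAdaBelief builds on adapts the step size by how far the observed gradient deviates from its exponential moving average, rather than by the raw squared gradient as in Adam. Below is a minimal NumPy sketch of a vanilla AdaBelief-style step, not the FastAdaBelief variant this paper proposes; the function name and hyperparameter defaults are illustrative assumptions.

```python
import numpy as np

def adabelief_step(param, grad, m, s, t, lr=1e-3,
                   beta1=0.9, beta2=0.999, eps=1e-8):
    """One AdaBelief-style update (illustrative sketch).

    m tracks an EMA of gradients (the "belief" about the gradient);
    s tracks an EMA of the squared deviation of the gradient from
    that belief, replacing Adam's EMA of squared gradients.
    t is the 1-indexed step count, used for bias correction.
    """
    m = beta1 * m + (1 - beta1) * grad             # EMA of gradients
    s = beta2 * s + (1 - beta2) * (grad - m) ** 2  # EMA of (grad - belief)^2
    m_hat = m / (1 - beta1 ** t)                   # bias-corrected belief
    s_hat = s / (1 - beta2 ** t)                   # bias-corrected deviation
    param = param - lr * m_hat / (np.sqrt(s_hat) + eps)
    return param, m, s
```

The effect is that a parameter takes a larger step when the gradient agrees with the running belief (small deviation) and a smaller step when it does not; the paper's contribution is a faster convergence rate for this family of optimizers under strong convexity.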
Keyphrases
- convergence rate
- variable step size
- convergence speed
- step size
- learning rate
- gradient method
- mutation operator
- optimization algorithm
- global convergence
- primal dual
- query optimization
- faster convergence rate
- belief revision
- wavelet neural network
- particle swarm optimization algorithm
- gravitational search algorithm
- differential evolution