Global Convergence of Stochastic Gradient Hamiltonian Monte Carlo for Nonconvex Stochastic Optimization: Nonasymptotic Performance Bounds and Momentum-Based Acceleration
Xuefeng Gao, Mert Gürbüzbalaban, Lingjiong Zhu
Published in: Oper. Res. (2022)
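For orientation, the sketch below shows a generic SGHMC-style update, i.e. an Euler discretization of underdamped (momentum) Langevin dynamics driven by stochastic gradients. It is an illustrative assumption, not the exact scheme or parameter choices analyzed in the paper; the names `sghmc`, `grad_estimate`, `step_size`, `friction`, and `inv_temperature` are hypothetical.

```python
import numpy as np

def sghmc(grad_estimate, theta0, step_size=1e-3, friction=1.0,
          inv_temperature=1.0, n_iters=1000, rng=None):
    """Generic SGHMC-style iteration (illustrative sketch only).

    `grad_estimate(theta)` is assumed to return an unbiased stochastic
    gradient of the (possibly nonconvex) objective at `theta`.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.array(theta0, dtype=float)
    v = np.zeros_like(theta)  # momentum (velocity) variable
    # Noise scale matching friction and inverse temperature (fluctuation-dissipation)
    noise_scale = np.sqrt(2.0 * friction * step_size / inv_temperature)
    for _ in range(n_iters):
        g = grad_estimate(theta)  # stochastic gradient at current iterate
        # Momentum update: friction drag, gradient force, injected Gaussian noise
        v = (v - step_size * (friction * v + g)
             + noise_scale * rng.standard_normal(theta.shape))
        theta = theta + step_size * v  # position update
    return theta
```

As a toy usage example, `sghmc(lambda th: 4 * th * (th**2 - 1) + 0.1 * np.random.randn(), 3.0)` runs the sketch with a noisy gradient of the double-well objective (theta^2 - 1)^2.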
Keyphrases
- monte carlo
- stochastic optimization
- global convergence
- stochastic gradient
- convergence rate
- step size
- learning rate
- convergence speed
- global optimum
- optimization methods
- multistage
- objective function
- upper bound
- markov chain
- lower bound
- importance sampling
- optimization problems
- global optimization
- convex optimization
- particle swarm optimization
- robust optimization
- pso algorithm
- worst case
- least squares
- differential evolution
- cost function
- multi objective
- evolutionary algorithm
- machine learning
- subband
- particle filter
- linear programming
- learning algorithm