A new non-adaptive optimization method: Stochastic gradient descent with momentum and difference.
Wei Yuan, Fei Hu, Liangfu Lu. Published in: Appl. Intell. (2022)
Keyphrases
- optimization method
- stochastic gradient descent
- optimization algorithm
- simulated annealing
- evolutionary algorithm
- matrix factorization
- genetic algorithm
- least squares
- differential evolution
- metaheuristic
- Nelder-Mead simplex
- loss function
- step size
- random forests
- nonnegative matrix factorization
- online algorithms
- cross validation
- learning rate
- multiple kernel learning
- cost function
- support vector
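The paper's method builds on stochastic gradient descent with momentum. As background only (not the authors' exact algorithm, which adds a difference term not specified here), a minimal sketch of classic SGD with momentum looks like this; the function names and hyperparameter values are illustrative assumptions:

```python
import numpy as np

def sgd_momentum(grad, x0, lr=0.1, beta=0.9, steps=500):
    """Classic SGD with momentum (background sketch, not the paper's method).

    grad  -- function returning the (stochastic) gradient at a point
    x0    -- initial parameter vector
    lr    -- learning rate (step size)
    beta  -- momentum coefficient
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)  # velocity accumulator
    for _ in range(steps):
        v = beta * v + grad(x)  # accumulate an exponentially weighted gradient
        x = x - lr * v          # move against the accumulated direction
    return x

# Toy usage: minimize f(x) = ||x||^2, whose gradient is 2x and minimum is at 0.
x_min = sgd_momentum(lambda x: 2 * x, [3.0, -4.0])
```

The momentum term damps oscillations across iterations, which is the behavior the listed keyphrases "learning rate", "step size", and "loss function" all relate to.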