Nonlinear Optimization Method Based on Stochastic Gradient Descent for Fast Convergence.
Takahiro Watanabe, Hitoshi Iima
Published in: SMC (2018)
Keyphrases
- optimization method
- stochastic gradient descent
- nonlinear optimization
- step size
- differential evolution
- convergence speed
- optimization algorithm
- loss function
- matrix factorization
- least squares
- optimization methods
- convergence rate
- genetic algorithm
- evolutionary algorithm
- metaheuristic
- simulated annealing
- Nelder-Mead simplex
- support vector machine
- random forests
- regularization parameter
- multiple kernel learning
- collaborative filtering
- weight vector
- importance sampling
- particle swarm optimization (PSO)
- ant colony optimization
- missing data
- optimization problems
- cost function
- multi-objective
- support vector
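As context for the keyphrases above, here is a minimal sketch of plain stochastic gradient descent applied to a least-squares fit (two of the listed topics). It is an illustrative textbook version, not the step-size scheme proposed in the paper; the function name, learning rate, and toy data are assumptions for the example.

```python
import random

def sgd_least_squares(xs, ys, lr=0.01, epochs=500, seed=0):
    """Fit y ~ w*x + b by stochastic gradient descent on squared loss.

    One randomly chosen sample is used per update, which is what
    distinguishes SGD from full-batch gradient descent.
    """
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    idx = list(range(len(xs)))
    for _ in range(epochs):
        rng.shuffle(idx)                    # visit samples in random order
        for i in idx:
            err = (w * xs[i] + b) - ys[i]   # prediction error on one sample
            w -= lr * err * xs[i]           # gradient of 0.5*err**2 w.r.t. w
            b -= lr * err                   # gradient of 0.5*err**2 w.r.t. b
    return w, b

# Recover y = 2x + 1 from noiseless samples.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 * x + 1.0 for x in xs]
w, b = sgd_least_squares(xs, ys)
```

On this noiseless toy problem the iterates settle near the true parameters (w close to 2, b close to 1); the convergence speed depends on the step size `lr`, which is exactly the quantity the paper's method adapts.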