Convergence of a stochastic subgradient method with averaging for nonsmooth nonconvex constrained optimization.
Andrzej Ruszczyński. Published in: Optim. Lett. (2020)
Keyphrases
- constrained optimization
- subgradient method
- globally convergent
- augmented lagrangian
- objective function
- stationary points
- optimization problems
- constrained optimization problems
- lagrangian dual
- duality gap
- lagrangian relaxation
- lagrange multipliers
- line search
- global convergence
- inequality constraints
- penalty function
- global optimum
- convergence rate
- optimal solution
- autocalibration
- variational inequalities
- regularization term
- evolutionary algorithm
- mathematical programming
- integer programming
- optimization methods
- column generation
- feasible solution
- linear programming
- dynamic programming
- cost function
- lower bound
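The title and keyphrases describe a stochastic subgradient method with averaging for nonsmooth nonconvex constrained problems. As a rough illustration only, the sketch below implements a generic projected stochastic subgradient iteration with an averaged search direction on a toy nonsmooth problem; the box constraint, the l1-type objective, and the step-size and averaging schedules are illustrative assumptions and are not taken from the paper.

```python
import numpy as np


def project_box(x, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^n (assumed feasible set)."""
    return np.clip(x, lo, hi)


def stochastic_subgradient(x, rng, noise=0.1):
    """Noisy subgradient of f(x) = ||x - c||_1 at x.

    sign(x - c) is a valid subgradient of the nonsmooth l1 objective;
    zero-mean Gaussian noise models the stochastic oracle. The target c
    is a hypothetical choice for this toy example.
    """
    c = np.linspace(-0.5, 0.5, x.size)
    return np.sign(x - c) + noise * rng.standard_normal(x.size)


def subgradient_method_with_averaging(x0, iters=5000, seed=0):
    """Projected stochastic subgradient iteration with subgradient averaging.

    x_{k+1} = Pi_X(x_k - tau_k * z_k),  z_k = (1 - a_k) z_{k-1} + a_k g_k,
    where g_k is a stochastic subgradient at x_k. Diminishing, non-summable
    schedules for tau_k and a_k are a common generic choice.
    """
    rng = np.random.default_rng(seed)
    x = project_box(np.asarray(x0, dtype=float))
    z = np.zeros_like(x)                    # averaged subgradient estimate
    for k in range(1, iters + 1):
        g = stochastic_subgradient(x, rng)
        a_k = 1.0 / k**0.75                 # averaging weight
        tau_k = 1.0 / k**0.75               # step size
        z = (1.0 - a_k) * z + a_k * g       # update the direction estimate
        x = project_box(x - tau_k * z)      # projected subgradient step
    return x


if __name__ == "__main__":
    x_final = subgradient_method_with_averaging(np.ones(5))
    print("approximate stationary point:", x_final)
```

Averaging the stochastic subgradients, rather than using each noisy sample directly, smooths the search direction and is the feature that the averaging-based analysis relies on; the specific convergence argument for nonconvex constrained problems is developed in the paper itself.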