A Damped Newton Method Achieves Global $\mathcal O \left(\frac{1}{k^2}\right)$ and Local Quadratic Convergence Rate.
Slavomír Hanzely, Dmitry Kamzolov, Dmitry Pasechnyuk, Alexander V. Gasnikov, Peter Richtárik, Martin Takáč. Published in: NeurIPS (2022)
Keyphrases
- convergence rate
- newton method
- convergence analysis
- global convergence
- convergence speed
- step size
- quasi newton
- learning rate
- variational inequalities
- primal dual
- linear equations
- optimality conditions
- superlinear convergence
- linear svm
- quadratic programming
- regularized least squares
- lower bound
- lower level
- optimization algorithm
- dynamic programming
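To illustrate the kind of iteration the title refers to, below is a minimal sketch of a generic damped Newton method. The fixed damping factor `alpha` and the test function are illustrative assumptions; the paper itself derives a specific adaptive step-size rule that attains the stated global $\mathcal O\left(\frac{1}{k^2}\right)$ and local quadratic rates, which this sketch does not reproduce.

```python
import numpy as np

def damped_newton(grad, hess, x0, alpha=0.5, tol=1e-10, max_iter=100):
    """Generic damped Newton iteration: x_{k+1} = x_k - alpha * H(x_k)^{-1} g(x_k).

    `alpha` is a fixed illustrative damping factor, not the adaptive
    step size analyzed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve the Newton system H(x) d = g(x) instead of forming an inverse.
        d = np.linalg.solve(hess(x), g)
        x = x - alpha * d
    return x

# Example: minimize the strictly convex f(x) = sum(x_i^4) + 0.5 * ||x||^2,
# whose unique minimizer is x = 0.
grad = lambda x: 4 * x**3 + x
hess = lambda x: np.diag(12 * x**2 + 1)
x_star = damped_newton(grad, hess, np.array([2.0, -1.5]))
```

Damping (taking a fraction of the Newton step) is what allows such methods to converge globally on convex problems where the undamped Newton step can overshoot far from the solution.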