Global Convergence of Gradient Descent for Deep Linear Residual Networks.
Lei Wu, Qingcan Wang, Chao Ma
Published in: NeurIPS (2019)
Keyphrases
- global convergence
- global optimum
- optimization methods
- convergence speed
- convergence analysis
- convergence rate
- convex minimization
- constrained optimization problems
- objective function
- coordinate ascent
- line search
- newton method
- social networks
- loss function
- cost function
- conjugate gradient
- gauss newton
- hybrid algorithm
- sensitivity analysis
- step size
- optimization problems
- simulated annealing
- globally convergent
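For context on the paper's setting, the following is a minimal, hypothetical sketch (not the authors' code) of plain gradient descent on a deep linear residual network f(x) = (I + W_L)...(I + W_1)x with squared loss and identity (zero-residual) initialization. The layer count, step size, and synthetic data are arbitrary assumptions for illustration only.

```python
import numpy as np

# Assumed setup: d-dimensional inputs, n samples, L residual layers.
rng = np.random.default_rng(0)
d, n, L = 10, 200, 8
X = rng.standard_normal((d, n))
A = 0.3 * rng.standard_normal((d, d))
Y = (np.eye(d) + A) @ X            # target realizable by the residual model

W = [np.zeros((d, d)) for _ in range(L)]   # identity initialization: W_l = 0
lr = 0.05                                   # assumed step size

def forward(X):
    """Return activations h_0, ..., h_L with h_l = (I + W_l) h_{l-1}."""
    hs = [X]
    for Wl in W:
        hs.append(hs[-1] + Wl @ hs[-1])
    return hs

for step in range(500):
    hs = forward(X)
    E = hs[-1] - Y                          # prediction error
    loss = 0.5 * np.mean(np.sum(E**2, axis=0))
    # Backpropagate through the linear residual blocks.
    G = E / n                               # dLoss/dh_L
    grads = [None] * L
    for l in reversed(range(L)):
        grads[l] = G @ hs[l].T              # dLoss/dW_l
        G = G + W[l].T @ G                  # dLoss/dh_{l-1} = (I + W_l)^T G
    for l in range(L):                      # vanilla gradient descent step
        W[l] -= lr * grads[l]

print(f"final training loss: {loss:.2e}")
```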