Tight Dimension Independent Lower Bound on Optimal Expected Convergence Rate for Diminishing Step Sizes in SGD.
Phuong Ha Nguyen, Lam M. Nguyen, Marten van Dijk. Published in: CoRR (2018)
Keyphrases
- step size
- convergence rate
- lower bound
- competitive ratio
- stochastic gradient descent
- worst case
- upper bound
- convergence speed
- optimal solution
- global convergence
- learning rate
- global optimum
- faster convergence
- primal dual
- gradient method
- evolutionary programming
- mutation operator
- line search
- cost function
- variable step size
- Lp norm
- objective function
- NP-hard
- neural network
- numerical stability
- Markov decision processes
- signal processing
- feature space
- faster convergence rate
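The paper's topic, per its title, is the expected convergence rate of stochastic gradient descent (SGD) under a diminishing step size. As a point of reference only, the sketch below is a minimal, generic illustration of SGD with a step size schedule eta_t = eta_0 / (t + 1) on a toy least-squares problem; the problem setup, constants, and variable names are illustrative assumptions, not the construction or bounds from the paper.

```python
import numpy as np

# Generic illustration (not the paper's construction): SGD on a simple
# least-squares objective with a diminishing step size eta_t = eta_0 / (t + 1).
# All data, constants, and names here are illustrative assumptions.

rng = np.random.default_rng(0)

n, d = 1000, 5                       # number of samples, dimension
X = rng.normal(size=(n, d))
w_star = rng.normal(size=d)          # ground-truth minimizer of the noiseless problem
y = X @ w_star + 0.1 * rng.normal(size=n)

def stochastic_grad(w, i):
    """Gradient of the i-th component objective 0.5 * (x_i^T w - y_i)^2."""
    return (X[i] @ w - y[i]) * X[i]

w = np.zeros(d)
eta0 = 0.5                           # illustrative initial step size
T = 20000

for t in range(T):
    i = rng.integers(n)              # sample one component uniformly at random
    eta_t = eta0 / (t + 1)           # diminishing step size, proportional to 1/t
    w -= eta_t * stochastic_grad(w, i)

print("squared distance to w_star:", np.linalg.norm(w - w_star) ** 2)
```

The quantity printed at the end, the squared distance between the iterate and the minimizer, is the kind of expected-convergence measure that lower bounds for diminishing step sizes are typically stated in terms of.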