Tight Nonparametric Convergence Rates for Stochastic Gradient Descent under the Noiseless Linear Model.
Raphaël Berthier, Francis R. Bach, Pierre Gaillard. Published in: CoRR (2020)
Keyphrases
- linear model
- stochastic gradient descent
- convergence rate
- step size
- least squares
- convergence speed
- linear models
- lower bound
- number of iterations required
- regression model
- upper bound
- learning rate
- linear svm
- loss function
- worst case
- matrix factorization
- random forests
- maximum likelihood
- cost function
- feature vectors
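To illustrate the setting the keyphrases describe, here is a minimal sketch (not the paper's code, and with arbitrary dimension, step size, and iteration count chosen for illustration) of constant-step-size SGD on a noiseless linear model: observations satisfy y = ⟨θ*, x⟩ exactly, and SGD minimizes the least-squares loss using one fresh feature vector per iteration.

```python
import numpy as np

# Noiseless linear model: y = <theta_star, x> with no observation noise.
rng = np.random.default_rng(0)
d = 20                                 # dimension (illustrative choice)
theta_star = rng.normal(size=d)        # ground-truth parameter

theta = np.zeros(d)                    # SGD iterate, started at zero
step_size = 0.01                       # constant step size (illustrative)
for _ in range(5000):
    x = rng.normal(size=d)             # fresh random feature vector
    y = x @ theta_star                 # noiseless observation
    grad = (x @ theta - y) * x         # gradient of 0.5 * (<theta, x> - y)^2
    theta -= step_size * grad

error = np.linalg.norm(theta - theta_star)
print(error)                           # distance to the true parameter
```

Because the model is noiseless, the stochastic gradients vanish at θ*, so a constant step size suffices for convergence; the paper's contribution is characterizing how fast this convergence is in the nonparametric regime.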