Tight Nonparametric Convergence Rates for Stochastic Gradient Descent under the Noiseless Linear Model.
Raphaël Berthier
Francis R. Bach
Pierre Gaillard
Published in: NeurIPS (2020)
Keyphrases
linear model
stochastic gradient descent
convergence rate
step size
least squares
convergence speed
number of iterations required
learning rate
lower bound
upper bound
linear models
regression model
matrix factorization
worst case
linear svm
loss function
parameter estimation
support vector machine
face recognition
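For orientation, the sketch below illustrates the setting named by the title and keyphrases: single-pass stochastic gradient descent on a least-squares objective under a noiseless linear model. It is a minimal illustration only; the dimension, step size, and iteration count are hypothetical choices and are not taken from the paper.

```python
# Minimal sketch (assumptions: Gaussian features, constant step size) of SGD
# on noiseless linear least squares, i.e. labels y = <x, theta_star> exactly.
import numpy as np

rng = np.random.default_rng(0)

d = 20                                  # ambient dimension (hypothetical)
theta_star = rng.normal(size=d) / np.sqrt(d)   # ground-truth parameter

theta = np.zeros(d)                     # SGD iterate, started at the origin
gamma = 0.5 / d                         # constant step size (hypothetical choice)
n_iter = 10_000

for t in range(n_iter):
    x = rng.normal(size=d)              # fresh random feature vector (single pass)
    y = x @ theta_star                  # noiseless label from the linear model
    grad = (x @ theta - y) * x          # stochastic gradient of 1/2 (<x, theta> - y)^2
    theta -= gamma * grad               # SGD update

print("squared parameter error:", np.sum((theta - theta_star) ** 2))
```

Because the labels carry no noise, the excess risk of the iterate can keep decreasing with the number of iterations rather than saturating at a noise floor; how fast it decreases, as a function of the step size and the problem's spectral properties, is the question the paper's upper and lower bounds address.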