The convergence of the Stochastic Gradient Descent (SGD): a self-contained proof.
Gabriel Turinici
Published in: CoRR (2021)
Keyphrases
- stochastic gradient descent
- update rule
- step size
- stochastic gradient
- least squares
- matrix factorization
- loss function
- convergence rate
- convergence speed
- online algorithms
- random forests
- multiple kernel learning
- weight vector
- support vector machine
- alternating least squares
- importance sampling
- cost function
- regularization parameter
- learning rate
- machine learning methods
- online learning
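
To illustrate the "update rule", "step size", and "least squares" keyphrases above, the snippet below is a minimal sketch of plain SGD on a synthetic least-squares problem. The quadratic loss, the decreasing step-size schedule, and all variable names are assumptions chosen for illustration; none of them is taken from the paper itself.

```python
# Minimal sketch of the SGD update rule on a synthetic least-squares problem.
# The loss, the step-size schedule and every variable name below are
# illustrative assumptions; nothing here is taken from the paper itself.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data for the per-sample loss  f_i(theta) = (a_i . theta - b_i)^2.
A = rng.normal(size=(1000, 5))
theta_true = rng.normal(size=5)
b = A @ theta_true + 0.01 * rng.normal(size=1000)

theta = np.zeros(5)
for n in range(1, 10_001):
    i = rng.integers(len(b))                    # pick one sample at random
    grad = 2.0 * (A[i] @ theta - b[i]) * A[i]   # stochastic gradient of f_i
    gamma = 0.1 / n**0.75                       # decreasing step size (learning rate)
    theta -= gamma * grad                       # SGD update rule

print("distance to the target parameters:", np.linalg.norm(theta - theta_true))
```

The step sizes in this sketch decrease so that their sum diverges while the sum of their squares stays finite, which is the classical setting in which convergence of the SGD iterates is usually stated.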